~4.5 hours of learning today

  • Re-learned Perceptrons: net input function, activation function, objective function, and loss function.
  • Learned the Adaline single-layer neural network and how it works.
  • First introduction to gradient descent, specifically stochastic gradient descent (SGD).
  • Learned about convexity and what it means for a loss function to be convex, including local vs. global minima.
  • Gained a basic understanding of the mean squared error (MSE) loss function.
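
The topics above fit together in one small model: Adaline uses a linear (identity) activation, its MSE loss is convex in the weights, and SGD updates the weights one sample at a time. Here is a minimal sketch of that idea; the function name `fit_adaline` and the toy dataset are my own illustrative choices, not from any particular source.

```python
import numpy as np

def fit_adaline(X, y, eta=0.05, epochs=200, seed=0):
    """Train an Adaline-style linear unit with SGD on MSE loss.

    eta is the learning rate; samples are shuffled each epoch,
    which is what makes the gradient descent 'stochastic'.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            output = X[i] @ w + b        # net input; identity activation
            error = y[i] - output
            # Gradient step on the per-sample loss 0.5 * error**2
            w += eta * error * X[i]
            b += eta * error
    return w, b

# Toy AND-like dataset with labels in {-1, +1} (hypothetical example).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, -1.0, -1.0, 1.0])

w, b = fit_adaline(X, y)
preds = np.where(X @ w + b >= 0.0, 1.0, -1.0)
mse = float(np.mean((X @ w + b - y) ** 2))
```

Because the MSE surface of a linear model is convex, any minimum SGD settles near is the global one; the noise from single-sample updates just makes the path there jittery rather than smooth.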

Things to improve:

  • Increase study hours (try 6 hours tomorrow)
  • Remove phone from desk to avoid distractions (place phone in another room?)
  • Build up general ML/math/statistics knowledge so new material is easier to learn.

Source: