Progress Report: Day 2
~4.5 hours of learning today
- Re-learned Perceptrons: net input function, activation function, objective function, and loss function.
- Learned the Adaline single-layer neural network and how it works.
- First introduction to gradient descent, specifically stochastic gradient descent (SGD).
- Learned convexity and what a convex loss function means, including local and global minima.
- Basic understanding of mean squared error (MSE) loss function.
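To tie these ideas together, here is a minimal sketch of an Adaline neuron trained with stochastic gradient descent on the MSE loss. The function names and toy dataset are my own for illustration, not from any course material; with the identity activation, the loss is convex, so SGD with a small enough learning rate heads toward the global minimum.

```python
import random

def train_adaline(X, y, lr=0.05, epochs=500, seed=0):
    """Single-neuron Adaline trained with stochastic gradient descent.

    Net input: z = w . x + b; the activation is the identity, so the
    MSE loss (y - z)^2 is convex in (w, b) and has one global minimum.
    """
    rng = random.Random(seed)
    n = len(X[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)  # "stochastic": visit samples in random order
        for i in order:
            # Net input function: weighted sum plus bias
            z = sum(wj * xj for wj, xj in zip(w, X[i])) + b
            err = y[i] - z
            # Gradient of 0.5 * (y - z)^2 w.r.t. w is -err * x,
            # so we step in the opposite (descent) direction
            w = [wj + lr * err * xj for wj, xj in zip(w, X[i])]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Threshold the linear net input to get a class label."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z >= 0.5 else 0

# Hypothetical toy data: label is 1 only when both inputs are 1 (AND)
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [0, 0, 0, 1]
w, b = train_adaline(X, y)
print([predict(w, b, x) for x in X])
```

Note the key difference from the perceptron: the weights are updated from the continuous net input (before thresholding), which is what makes the loss differentiable and gradient descent applicable.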
Things to improve:
- Increase study hours (try 6 hours tomorrow)
- Remove phone from desk to avoid distractions (place phone in another room?)
- Build up general ML/math/statistics knowledge to learn new material more easily.
Source:
Daily progress report (March 6, 2024):
— Lelouch 👾 (@lelouchdaily) March 6, 2024
Mood: meh
~3 hours of learning today :(
- Started watching Google's ML crash course
- Learned different ways of measuring loss (L1, L2/MSE)
- Realized I need to learn differential calculus, partial derivatives, and gradients for gradient descent
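A quick sketch of the L1 vs. L2 loss difference mentioned above (the function names and sample numbers are my own, not from the course): squaring the error makes L2/MSE penalize a single large outlier much more heavily than L1 does.

```python
def l1_loss(y_true, y_pred):
    """Mean absolute error: average of |y - y_hat|."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def l2_loss(y_true, y_pred):
    """Mean squared error: average of (y - y_hat)^2."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Three perfect predictions and one outlier that is off by 6
y_true = [1.0, 2.0, 3.0, 10.0]
y_pred = [1.0, 2.0, 3.0, 4.0]
print(l1_loss(y_true, y_pred))  # 6/4  = 1.5
print(l2_loss(y_true, y_pred))  # 36/4 = 9.0
```

The same single error of 6 contributes 1.5 to L1 but 9.0 to L2, which is why MSE-trained models work harder to shrink large errors.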