- 0 Introduction
- 1.1 Linear Regression and the Loss Function
- 1.2 Gradient Descent
- 2 Linear Algebra Review
- 3 Multivariate Regression, Gradient Descent, and the Normal Equation
- 4 Logistic Regression and Classification
- 5.1 Learning Theory and the Bias/Variance Trade-off
- 5.2 Regularization
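As a taste of what sections 1.1–1.2 cover, here is a minimal batch-gradient-descent sketch for one-variable linear regression. It is in Python rather than the course's Octave (or this project's R), and the toy data and learning rate are illustrative choices, not from the course:

```python
import numpy as np

# Toy data: y = 2x + 1 with no noise, so gradient descent
# should recover theta close to [1, 2].
X = np.array([[1.0, x] for x in range(10)])  # leading column of 1s for the intercept
y = 2.0 * X[:, 1] + 1.0
m = len(y)

theta = np.zeros(2)
alpha = 0.02  # learning rate (illustrative choice)
for _ in range(5000):
    # Gradient of the squared-error cost J(theta) = (1/2m) * sum((X @ theta - y)^2)
    grad = X.T @ (X @ theta - y) / m
    theta -= alpha * grad

print(theta)  # approximately [1, 2]
```

The same loop generalizes directly to the multivariate case in section 3; only the shape of `X` and `theta` changes.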
Currently, most of my source code is embedded in the posts, but I will publish it on GitHub in a more structured format as the series progresses.
Source code for this project:
- Source code for my stanford.ml R package

Others have gone through similar exercises in the past:
- jorgeortiz85, using Scalala
- jandot, using Clojure
- joewandy, using Octave
- ovatsus, using F#
One of the neatest things about ml-class.org is how it's inspiring so many people to study the material. Here are some groups studying it together:
- NYC Machine Learning (organized by Paul Dix; I live in NYC, so I wish I could attend...)
- "NYC's Stanford ML (Machine Learning) CS 229 MeetUp"
Let me know if you find other links related to the class and I'll post them!