I am recreating the primary material from Stanford CS 229 in R. These posts accompany the lectures, which can be found on ml-class.org and the CS229 homepage.

- 0 Introduction
- 1.1 Linear Regression and the Loss Function
- 1.2 Gradient Descent
- 2 Linear Algebra Review
- 3 Multivariate Regression, Gradient Descent, and the Normal Equation
- 4 Logistic Regression and Classification
- 5.1 Learning Theory and the Bias/Variance Trade-off
- 5.2 Regularization
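
Since the series is about implementing this material in R, here is a minimal sketch of the kind of code the posts (1.1 and 1.2 in particular) walk through: batch gradient descent fitting a simple linear regression. The data, learning rate, and iteration count are made up for illustration, not taken from the course.

```r
# Minimal sketch: batch gradient descent for simple linear regression.
# Synthetic data: y = 2 + 3x + noise (values chosen for illustration).
set.seed(1)
x <- runif(100)
y <- 2 + 3 * x + rnorm(100, sd = 0.1)

X <- cbind(1, x)      # design matrix with an intercept column
theta <- c(0, 0)      # parameters: intercept and slope
alpha <- 0.1          # learning rate (illustrative choice)

for (i in 1:5000) {
  # gradient of the mean squared error loss
  grad <- t(X) %*% (X %*% theta - y) / length(y)
  theta <- theta - alpha * grad
}

theta  # should recover values near 2 and 3
```

This is the same fit the normal equation (post 3) gives in closed form; gradient descent just reaches it iteratively.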

Currently, most of my source code is embedded in the posts, but I will be publishing it on GitHub in a more structured format as the project progresses.

### Source Code

Source code for this project and others:

- Source code for my stanford.ml R package
- jorgeortiz85 using Scalala
- jandot using Clojure
- joewandy using Octave
- ovatsus using F#

### Other Efforts

Others have gone through similar exercises in the past:

- Mechanistician (in MATLAB)
- Al3xandr3 (in R)
- "YGC" (in R)
- bursic (in Scala)
- AI in Motion (in Python)

### Study Groups

One of the neatest things about ml-class.org is how it has inspired so many people to study the material. Here are some groups studying it together:

- NYC Machine Learning (organized by Paul Dix; I live in NYC, so I wish I could attend...)
- "NYC's Stanford ML (Machine Learning) CS 229 MeetUp"

### Press Coverage

Let me know if you find other links related to the class and I'll post them!