A review of Andrew Ng's machine learning course

This course is a gentle introduction to machine learning; the main idea in many of the lessons is to impart the intuition of how different machine learning techniques work. There is some math, but the notation shouldn't scare anybody off. Personally, writing down the equations helped the ideas stick with me, rather than passively consuming the information. Programming-wise it's not too challenging either, although it may be a little frustrating for a complete newbie to get started with MATLAB/Octave. Pseudocode comes up in the lectures often, but it's mostly algorithms that iterate with for-loops (gradient descent, backpropagation). The hardest part is possibly mapping the mathematical formulas to code and wrapping one's head around vectorized code.
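
To give a flavour of that last point, here is roughly what the vectorized batch gradient descent update for linear regression ends up looking like in Octave. This is a sketch of my own, not the course's starter code, so take the function name and argument order as illustrative:

    % One possible vectorized batch gradient descent for linear regression.
    % X is m-by-(n+1) with a leading column of ones, y is m-by-1,
    % theta is (n+1)-by-1, alpha is the learning rate.
    function theta = gradientDescent(X, y, theta, alpha, num_iters)
      m = length(y);                                  % number of training examples
      for iter = 1:num_iters
        errors = X * theta - y;                       % residuals for every example at once
        theta = theta - (alpha / m) * (X' * errors);  % simultaneous update of all parameters
      end
    end

The inner loop is one matrix-vector multiply and one vector update; the "mapping formulas to code" difficulty is mostly in convincing yourself that this single line really is the sum over all training examples from the lecture slides.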

The first part of the course covers supervised learning methods: Linear Regression, Logistic Regression, simple supervised Neural Networks, and Linear/Gaussian-kernel Support Vector Machines (large-margin classifiers). These techniques are covered briefly and, again, the main thing is to impart the intuition of how the math behind these methods works. Some more in-depth examples of what's covered are:
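
The logistic regression assignment, for example, has you fill in a vectorized cost function and its gradient. A minimal sketch of that kind of function, reconstructed from memory rather than copied from the course materials, looks something like this:

    % Vectorized logistic regression cost (cross-entropy) and gradient.
    % X is m-by-(n+1), y is m-by-1 with 0/1 labels, theta is (n+1)-by-1.
    function [J, grad] = costFunction(theta, X, y)
      m = length(y);                                          % number of training examples
      h = 1 ./ (1 + exp(-X * theta));                         % sigmoid hypothesis for all examples
      J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h));   % scalar cost
      grad = (1 / m) * (X' * (h - y));                        % gradient w.r.t. theta
    end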

The second part of the course covers unsupervised learning and dimensionality reduction methods like K-Means and Principal Component Analysis, along with the motivations for using them (visualization, removing redundant features, software engineering concerns).
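
The PCA recipe taught in the lectures is short enough to summarize in a few lines of Octave: normalize the features, form the covariance matrix, take its SVD, and project onto the top k principal components. The condensed sketch below is my own, with illustrative variable names, not the assignment code:

    % PCA via SVD of the covariance matrix, reducing X (m-by-n) to k dimensions.
    m = size(X, 1);
    mu = mean(X);
    sd = std(X);
    X_norm = (X - mu) ./ sd;                % feature normalization (relies on automatic broadcasting)
    Sigma = (1 / m) * (X_norm' * X_norm);   % n-by-n covariance matrix
    [U, S, V] = svd(Sigma);                 % columns of U are the principal components
    k = 2;                                  % e.g. 2 dimensions for visualization
    Z = X_norm * U(:, 1:k);                 % projected (reduced) data

With k = 2 or 3 the projected data Z can be plotted directly, which is exactly the visualization motivation the lectures bring up.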