I recently signed up for Machine Learning on Coursera (Andrew Ng, Stanford). In the first week of lectures, I already felt lost in the jargon. But nothing worth learning comes easy, so persistence, persistence, persistence.

The first concept I realized I didn’t understand was fitting the model. When I searched the web, I found many definitions. Tying them together in my own words, ‘fit the model’ means finding a function that represents the pattern in a set of data.

I remember having to graph all sorts of functions in school. I’m of the age where we did this with pencil and paper. Visually, fitting the model involves overlaying a scatter plot of the data set with different kinds of graphed functions. The graphed function that comes closest to the points overall might be a good model for predicting new values for that kind of data.
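That overlay-a-function-on-a-scatter-plot idea can be sketched in a few lines of Python. This is just an illustration, not anything from the course: the data below is made up to roughly follow a straight line, and NumPy’s polyfit does the fitting.

```python
import numpy as np

# Made-up points that roughly follow the line y = 2x + 1, with some noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

# Fit a degree-1 polynomial (a straight line) to the scatter
slope, intercept = np.polyfit(x, y, 1)

# The fitted line is the 'model'; it can now predict y for a new x
predicted = slope * 6.0 + intercept
print(slope, intercept, predicted)
```

The fitted slope and intercept land near 2 and 1, recovering the pattern hidden in the noisy points.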

Most humans easily see patterns in pictures, but a digital machine works better and faster with functions. To understand what machine learning is doing, I expect to dive into how graphs are built from functions (linear algebra, calculus) and how to predict new values based on probability (statistics). I just need to take it one concept at a time.