Categories
Machine Learning Study Notes

Machine Learning Gradient Descent

    The final part of the basic supervised machine learning trinity is the gradient descent algorithm. Given a hypothesis and a cost function, this algorithm iterates through different values of ‘theta’ (remember, this is a parameter set) to find something called a local optimum for a particular data set. Right, but that explanation is […]
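As a rough sketch of the idea, here is batch gradient descent for a univariate linear hypothesis in Python. The learning rate, iteration count, and sample data are my own illustrative choices, not values from the notes.

```python
# Batch gradient descent: repeatedly nudge theta0 and theta1
# in the direction opposite the gradient of the cost.
def gradient_descent(xs, ys, alpha=0.05, iters=2000):
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        preds = [theta0 + theta1 * x for x in xs]
        # Partial derivatives of the mean squared error cost
        grad0 = sum(p - y for p, y in zip(preds, ys)) / m
        grad1 = sum((p - y) * x for p, y, x in zip(preds, ys, xs)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Data drawn from y = 2x + 1, so theta should converge near (1, 2).
t0, t1 = gradient_descent([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
```

With a suitable learning rate the parameters settle close to the values that generated the data; too large a rate and the updates diverge instead.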

Machine Learning Parameters

    Time to introduce a vital concept in machine learning: parameters. Previously, I talked about ‘thetaZero’ and ‘thetaOne’ and their use in the linear hypothesis algorithm. The proper term for them is ‘parameters’, and they belong to a set ‘theta’, commonly notated as θ. It is best to think of ‘theta’ as the vector (θ₀, θ₁, …, θₙ), where ‘n’ is […]
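As a small illustration (mine, not from the notes), the parameter set can be held in a plain list, with thetaZero at index 0:

```python
# The parameter set 'theta' as a list: theta[0] is thetaZero,
# theta[1] is thetaOne, and so on. Values here are illustrative.
theta = [0.5, 1.25, -0.75]

# A hypothesis with n input features uses n + 1 parameters,
# because of the extra intercept term thetaZero.
n = len(theta) - 1
```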

Machine Learning A Cost Function

    I call this a cost function rather than the cost function because one cost function does not fit all. Cost functions vary depending on the kind of data set or machine learning model. For the purposes of machine learning, cost functions become the algorithms that measure the performance of hypotheses. This equation comes […]
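For the linear hypothesis case, one common choice is the mean squared error cost. A minimal Python sketch (the data values are my own example):

```python
# Mean squared error cost J(theta0, theta1) for the univariate
# linear hypothesis h(x) = theta0 + theta1 * x.
def cost(theta0, theta1, xs, ys):
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2
               for x, y in zip(xs, ys)) / (2 * m)

# A perfect fit on data from y = 2x + 1 has zero cost;
# any other parameters score worse (higher).
j = cost(1.0, 2.0, [0, 1, 2], [1, 3, 5])
```

The 1/(2m) factor is a convention that simplifies the derivative used later by gradient descent; it does not change which parameters minimize the cost.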

Machine Learning The Linear Hypothesis

    The very first formula I learned in machine learning (and the first time I tried writing in LaTeX!) was the linear hypothesis, hθ(x) = θ₀ + θ₁x. So pretty cool, but what does it mean? This is an example of a univariate hypothesis. ‘Univariate’ is a fancy way of saying that I have one variable (let’s call it ‘x’ for now) that […]
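In code, the univariate hypothesis is just a line: one input x, two parameters. A tiny sketch with illustrative values of my own:

```python
# Univariate linear hypothesis: one input variable x, two
# parameters thetaZero (intercept) and thetaOne (slope).
def hypothesis(theta0, theta1, x):
    return theta0 + theta1 * x

y = hypothesis(1.0, 2.0, 3.0)  # 1 + 2*3 = 7.0
```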

Machine Learning Displaying a Formula

I am realizing that machine learning likes mathematical notation. Time to figure out how to put a formula in a post. Since the 1980s, the go-to tool for publishing mathematical papers has been LaTeX. It has a cool history, but to write my study notes, I just needed to figure out how to use […]
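As a tiny example (my own, not from the post), the linear hypothesis can be typeset from LaTeX source like this:

```latex
% Display math goes between \[ and \];
% subscripts use _, and Greek letters have named commands.
\[
  h_{\theta}(x) = \theta_0 + \theta_1 x
\]
```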

Machine Learning Three Algorithms

Linear regression, the cost function, and gradient descent sum up the first two weeks of Professor Ng’s Machine Learning class on Coursera. Since I am not aspiring to be a mathematician, I am going to refer to them as algorithms. Each of these algorithms has a particular place when designing a machine learning solution. Linear regression, […]

Machine Learning Fit the Model

I recently signed up for Machine Learning on Coursera (Andrew Ng, Stanford). In the first week of lectures, I already felt lost in the jargon. But nothing worth learning comes easy, so persistence, persistence, persistence. The first concept I realized that I didn’t understand involved fitting the model. When I searched the web, I found […]