Gradient Descent

I recently learned about a cool way to minimize functions (like the true mathie I am!), and that way is Gradient Descent. It’s a method I was never taught in my Math degree, and it gives a nice way to tackle the classic linear regression problem.

Here’s how it works: say you have a function defined by some set of parameters (for example, a typical cost function). Starting from some initial parameter values, Gradient Descent takes “baby steps” (with a step size you choose), iteratively, towards parameter values that minimize the function. This happens through the magical methods of calculus! More specifically, each step is proportional to the negative of the gradient of the function at the current point.

You can also have some fun implementing it yourself in your favourite coding environment (Octave is free software similar to Matlab, and it’s good for beginners!).

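To give a flavour of what such an implementation might look like, here is a minimal sketch in Python (the post suggests Octave, but the idea is the same in any language). It fits a simple line y = theta0 + theta1·x by gradient descent on the mean squared error; the data, learning rate, and iteration count are made-up illustrative choices, not anything from the original post.

```python
# Gradient descent for simple linear regression: fit y = theta0 + theta1 * x
# by minimizing the mean squared error cost.
# (Illustrative sketch: data, learning rate, and step count are made up.)

def gradient_descent(xs, ys, alpha=0.1, steps=1000):
    """Return (theta0, theta1) after `steps` iterations of gradient descent."""
    theta0, theta1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Prediction errors under the current parameters.
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of the cost J = (1/2n) * sum(error^2).
        grad0 = sum(errors) / n
        grad1 = sum(e * x for e, x in zip(errors, xs)) / n
        # The "baby step": move proportionally to the negative gradient.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Points lying exactly on y = 1 + 2x, so we expect theta0 ≈ 1, theta1 ≈ 2.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
theta0, theta1 = gradient_descent(xs, ys)
print(round(theta0, 2), round(theta1, 2))
```

If the learning rate `alpha` is too large the steps overshoot and the parameters diverge; too small and convergence takes ages. That trade-off is exactly why the step size is "defined by you".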

Related links: 

https://www.gnu.org/software/octave/

http://en.wikipedia.org/wiki/Gradient_descent
