# Mathematical Food for Thought

Serves a Daily Special and an All-You-Can-Eat Course in Problem Solving. Courtesy of me, Jeffrey Wang.

## Meta

**The Smaller The Better.** Topic: Calculus. June 18th, 2007

Problem: Given a complicated function $f : \mathbb{R}^n \to \mathbb{R}$, find an approximate local minimum.

Solution: The adjective "complicated" is only there so that we assume there is no easy way to solve $\nabla f(x) = 0$ to immediately obtain the solution. We seek an algorithm that will lead us to a local minimum (hopefully a global minimum as well).

We start at an arbitrary point $x_0$. Consider the following process (for $n = 0, 1, 2, \ldots$), known as gradient descent:

1. Calculate (approximately) $\nabla f(x_n)$.

2. Set $x_{n+1} = x_n - \gamma_n \nabla f(x_n)$, where $\gamma_n > 0$ is a constant that can be determined by a line search.
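As a concrete sketch (not from the original post), the two steps above can be written in plain Python: the gradient is approximated by central differences, and $\gamma_n$ is found by a simple Armijo-style backtracking line search. The test function and all names here are hypothetical illustrations.

```python
def grad(f, x, h=1e-6):
    """Step 1: approximate the gradient of f at x by central differences."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def gradient_descent(f, x0, steps=200, tol=1e-6):
    x = list(x0)
    for _ in range(steps):
        g = grad(f, x)
        if sum(gi * gi for gi in g) < tol ** 2:
            break  # gradient is (nearly) zero: approximate local minimum
        # Backtracking line search for gamma: shrink until f decreases enough.
        gamma, fx, gg = 1.0, f(x), sum(gi * gi for gi in g)
        while f([xi - gamma * gi for xi, gi in zip(x, g)]) > fx - 0.5 * gamma * gg:
            gamma *= 0.5
        # Step 2: move against the gradient.
        x = [xi - gamma * gi for xi, gi in zip(x, g)]
    return x

# Hypothetical test function with known minimum at (3, -1).
f = lambda v: (v[0] - 3) ** 2 + 2 * (v[1] + 1) ** 2
xmin = gradient_descent(f, [0.0, 0.0])
```

On this quadratic the iterates land on the minimizer $(3, -1)$ after only a few steps; the slow convergence mentioned below shows up on badly conditioned or non-quadratic objectives.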

It is well known that the gradient points in the direction of maximum increase, and that the direction opposite the gradient is the direction of maximum decrease. Hence this algorithm is based on the idea that we always move in the direction that decreases $f$ the most. Sounds pretty good, right? Unfortunately, gradient descent converges very slowly, so it is only really useful for smaller optimization problems. Fortunately, there exist faster algorithms, though they are naturally more complex, such as the nonlinear conjugate gradient method or Newton's method; the latter involves computing the inverse of the Hessian matrix, which is a pain.
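To make the Newton remark concrete, here is a sketch in two variables, where the $2 \times 2$ Hessian can be inverted by hand via the adjugate formula: the iteration is $x_{n+1} = x_n - H^{-1} \nabla f(x_n)$. The test function $f(x, y) = x^4 + y^2 - 2xy$ is a hypothetical example (not from the post), with a minimum at $x = y = 1/\sqrt{2}$.

```python
def newton_step(x, y):
    """One Newton step for f(x, y) = x**4 + y**2 - 2*x*y."""
    # Analytic gradient of f.
    gx = 4 * x ** 3 - 2 * y
    gy = 2 * y - 2 * x
    # Analytic Hessian of f.
    a, b = 12 * x ** 2, -2.0   # d2f/dx2,  d2f/dxdy
    c, d = -2.0, 2.0           # d2f/dydx, d2f/dy2
    # Invert the 2x2 Hessian: H^{-1} = (1/det) [[d, -b], [-c, a]].
    det = a * d - b * c
    # x_{n+1} = x_n - H^{-1} grad f(x_n)
    return (x - (d * gx - b * gy) / det,
            y - (-c * gx + a * gy) / det)

x, y = 2.0, 2.0
for _ in range(50):
    x, y = newton_step(x, y)
```

Even in this toy case the bookkeeping is noticeable, and in $n$ dimensions the Hessian inverse costs far more than a gradient, which is exactly the pain referred to above.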