Slide 62 of 67

Is gradient descent the only tool available for finding minima of complex/non-convex functions?


I think Newton's method should also work. According to Wikipedia (the article "Newton's_method_in_optimization"), Newton's method uses curvature information (i.e., the second derivative) to take a more direct route.
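To make the "more direct route" concrete, here is a minimal 1-D sketch (my own illustration, not from the lecture): Newton's method for optimization applies Newton's root-finding iteration to f', so each update divides the first derivative by the second.

```python
def newton_minimize_1d(df, d2f, x0, iters=20):
    """Newton's method for 1-D optimization: x <- x - f'(x) / f''(x).

    Finds a stationary point of f by applying Newton's root-finding
    method to f'; the second derivative supplies the curvature.
    """
    x = x0
    for _ in range(iters):
        x -= df(x) / d2f(x)
    return x

# Example: f(x) = (x - 2)**2, so f'(x) = 2*(x - 2) and f''(x) = 2.
# Because this objective is exactly quadratic, Newton's method lands
# on the minimizer x = 2 in a single step.
x_star = newton_minimize_1d(lambda x: 2 * (x - 2), lambda x: 2.0, x0=10.0)
```

On a quadratic the quadratic model Newton builds is exact, which is why it jumps straight to the minimum; on general functions it still typically converges in far fewer iterations than gradient descent once it is near a minimum.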

Sneaky Turtle

Gradient descent converges to the global minimum for convex functions, but in general it only guarantees convergence to a local minimum (more precisely, a stationary point).
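A tiny sketch of that distinction (my own example; the functions and step size are arbitrary choices): on a convex quadratic, gradient descent reaches the global minimum from any start, while on a non-convex quartic the minimum it finds depends on the starting point.

```python
def gradient_descent(grad, x0, lr=0.01, iters=5000):
    """Plain gradient descent with a fixed learning rate."""
    x = x0
    for _ in range(iters):
        x -= lr * grad(x)
    return x

# Convex case: f(x) = (x - 3)**2 has a single (global) minimum at x = 3,
# so any starting point converges to it.
convex_min = gradient_descent(lambda x: 2 * (x - 3), x0=-10.0)

# Non-convex case: f(x) = x**4 - 3*x**2 + x has two local minima
# (one negative, one positive); which one gradient descent finds
# depends entirely on where it starts.
quartic_grad = lambda x: 4 * x**3 - 6 * x + 1
left_min = gradient_descent(quartic_grad, x0=-2.0)
right_min = gradient_descent(quartic_grad, x0=2.0)
```

Starting the quartic example at x = -2 versus x = 2 settles into two different basins, which is exactly the "only local minima in general" caveat.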


Newton's method is also a good option for finding minima, but it can be computationally costly due to the calculation of the Hessian.
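To illustrate where that cost comes from (a sketch under my own assumptions, using NumPy): each multivariate Newton step solves a linear system in the n-by-n Hessian, an O(n^3) operation, whereas a gradient step only needs the O(n) gradient vector.

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solve H(x) @ step = grad(x); this linear solve (plus forming the
    # n x n Hessian itself) is what makes Newton's method expensive.
    return x - np.linalg.solve(hess(x), grad(x))

# For a quadratic f(x) = 0.5 * x @ A @ x - b @ x, the gradient is
# A @ x - b and the Hessian is the constant matrix A, so a single
# Newton step from anywhere jumps straight to the minimizer of f,
# i.e. the solution of A @ x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x1 = newton_step(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```

This is also why quasi-Newton methods such as BFGS are popular: they approximate the Hessian (or its inverse) from gradient differences instead of computing and factoring it exactly.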
