Gradient method
In optimization, a gradient method is an algorithm to solve problems of the form
:<math>\min_{x\in\mathbb{R}^n}\; f(x)</math>
with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
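A minimal sketch of the idea, assuming a differentiable objective whose gradient can be evaluated at any point; the function name <code>gradient_descent</code>, the fixed step size <code>alpha</code>, and the stopping tolerance <code>tol</code> are illustrative choices for this sketch, not part of the definition above.
<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Minimize a differentiable function given its gradient `grad`.

    At each iteration the search direction is the negative gradient
    at the current point -- the defining choice of a gradient method.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient near zero: stationary point
            break
        x = x - alpha * g            # step along -grad f(x)
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 2*(y + 3)^2,
# whose gradient is (2*(x - 1), 4*(y + 3)); the minimizer is (1, -3).
x_min = gradient_descent(lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)]),
                         x0=[0.0, 0.0])
print(x_min)  # approximately [ 1. -3.]
</syntaxhighlight>
More sophisticated gradient methods, such as the conjugate gradient method, differ mainly in how the search direction and step size are chosen at each iteration.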
See also
{{div col|colwidth=22em}}
- Gradient descent
- Stochastic gradient descent
- Coordinate descent
- Frank–Wolfe algorithm
- Landweber iteration
- Random coordinate descent
- Conjugate gradient method
- Derivation of the conjugate gradient method
- Nonlinear conjugate gradient method
- Biconjugate gradient method
- Biconjugate gradient stabilized method
{{div col end}}
References
- {{cite book |author=Elijah Polak |year=1997 |title=Optimization: Algorithms and Consistent Approximations |publisher=Springer-Verlag |isbn=0-387-94971-2}}
{{Optimization algorithms}}
{{DEFAULTSORT:Gradient Method}}
Category:Optimization algorithms and methods
Category:Numerical linear algebra
{{linear-algebra-stub}}