Gradient method

In optimization, a gradient method is an algorithm to solve problems of the form

:<math>\min_{x\in\mathbb R^n}\; f(x)</math>

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
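A minimal sketch of the simplest gradient method, gradient descent with a fixed step size, might look as follows; the step size, tolerance, and example objective here are illustrative assumptions, not part of the definition above.

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Sketch of gradient descent with a fixed step size (illustrative)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                   # gradient at the current point
        if np.linalg.norm(g) < tol:   # stop near a stationary point
            break
        x = x - step * g              # search direction is the negative gradient
    return x

# Example: f(x) = ||x||^2 has gradient 2x, so the minimizer is x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
</syntaxhighlight>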

See also

  • Gradient descent
  • Conjugate gradient method

References

  • {{cite book | author=Elijah Polak | year=1997 | title=Optimization: Algorithms and Consistent Approximations | publisher=Springer-Verlag | isbn=0-387-94971-2}}

{{Optimization algorithms}}

{{DEFAULTSORT:Gradient Method}}

Category:First order methods

Category:Optimization algorithms and methods

Category:Numerical linear algebra

{{linear-algebra-stub}}