Descent direction
In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.
Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$-th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle < 0$.
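The descent condition can be checked numerically; the following is a minimal sketch (not part of the original article) using a hypothetical quadratic objective to illustrate that the negative gradient satisfies it.

```python
import numpy as np

def is_descent_direction(grad, p):
    """Return True if p satisfies the descent condition <p, grad f(x)> < 0."""
    return float(np.dot(p, grad)) < 0.0

# Hypothetical example: f(x) = x^T x has gradient 2x.
x = np.array([1.0, -2.0])
grad = 2.0 * x
p = -grad  # negative of a non-zero gradient
print(is_descent_direction(grad, p))  # True: <-grad, grad> = -||grad||^2 < 0
```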
Numerous methods exist for computing descent directions, each with differing merits, such as gradient descent or the conjugate gradient method.
More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P \nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$, since $\langle -P \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\nabla f(\mathbf{x}_k)^{\mathsf T} P \nabla f(\mathbf{x}_k) < 0$ whenever $\nabla f(\mathbf{x}_k) \neq 0$.{{cite book | author = J. M. Ortega and W. C. Rheinboldt | title = Iterative Solution of Nonlinear Equations in Several Variables | pages = 243 | year = 1970 | doi = 10.1137/1.9780898719468 | isbn = 978-0-89871-461-6 }} This generality is used in preconditioned gradient descent methods.
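As a minimal sketch (not from the cited source), the iteration below applies a fixed positive definite preconditioner $P$ to the gradient; the quadratic objective, step size, and the choice $P = A^{-1}$ are hypothetical illustrations.

```python
import numpy as np

def preconditioned_gradient_descent(grad_f, P, x0, step=0.1, iters=100):
    """Iterate x_{k+1} = x_k + step * p_k with p_k = -P grad_f(x_k).

    P is assumed symmetric positive definite, so each p_k is a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        p = -P @ grad_f(x)  # preconditioned descent direction
        x = x + step * p
    return x

# Hypothetical example: f(x) = 0.5 x^T A x with an ill-conditioned A.
A = np.array([[100.0, 0.0],
              [0.0,   1.0]])
grad_f = lambda x: A @ x
P = np.linalg.inv(A)  # ideal preconditioner here; in practice an approximation
print(preconditioned_gradient_descent(grad_f, P, x0=[1.0, 1.0]))  # approaches [0, 0]
```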