Descent direction

In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that, in the sense below, moves us closer towards a local minimum $\mathbf{x}^*$ of our objective function $f : \mathbb{R}^n \to \mathbb{R}$.

Suppose we are computing $\mathbf{x}^*$ by an iterative method, such as line search. We define a descent direction at the $k$th iterate to be any $\mathbf{p}_k \in \mathbb{R}^n$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
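As a minimal sketch of the defining condition, the inner-product test can be checked numerically. The quadratic objective and its gradient below are illustrative assumptions, not from the article:

```python
import numpy as np

# Hypothetical objective f(x) = x1^2 + 10*x2^2 and its gradient
# (an assumed example for illustration).
def grad_f(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])

def is_descent_direction(p, x):
    # p is a descent direction at x iff <p, grad f(x)> < 0
    return float(np.dot(p, grad_f(x))) < 0.0

x = np.array([1.0, 1.0])
print(is_descent_direction(-grad_f(x), x))  # negative gradient -> True
print(is_descent_direction(grad_f(x), x))   # gradient itself -> False
```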

Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle < 0$.

Numerous methods exist to compute descent directions, all with differing merits. For example, one could use gradient descent or the conjugate gradient method.
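The simplest of these, gradient descent, pairs the negative-gradient direction with a line search. A minimal sketch, assuming the same illustrative quadratic objective as above and a standard backtracking (Armijo) line search:

```python
import numpy as np

# Assumed quadratic objective f(x) = x1^2 + 10*x2^2 (illustrative only).
def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad_f(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])

def gradient_descent(x, steps=100, alpha0=1.0, beta=0.5, c=1e-4):
    for _ in range(steps):
        g = grad_f(x)
        p = -g  # descent direction: <p, g> = -<g, g> < 0
        # Backtracking line search: shrink the step until the
        # sufficient-decrease (Armijo) condition holds.
        alpha = alpha0
        while f(x + alpha * p) > f(x) + c * alpha * np.dot(g, p):
            alpha *= beta
        x = x + alpha * p
    return x

x_star = gradient_descent(np.array([1.0, 1.0]))
print(x_star)  # converges towards the minimizer at the origin
```

Because every step moves along a descent direction with a step length that enforces decrease, the iterates approach the minimizer at the origin.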

More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P \nabla f(\mathbf{x}_k)$ is a descent direction[1] at $\mathbf{x}_k$.

This generality is used in preconditioned gradient descent methods.
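A short sketch of this claim, reusing the illustrative quadratic gradient from above with an assumed diagonal positive definite preconditioner (here the inverse of the quadratic's Hessian):

```python
import numpy as np

# Gradient of the assumed objective f(x) = x1^2 + 10*x2^2.
def grad_f(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])

# A positive definite preconditioner P (hypothetical choice:
# the inverse Hessian diag(1/2, 1/20) of the quadratic).
P = np.diag([0.5, 0.05])

x = np.array([3.0, -2.0])
g = grad_f(x)
p = -P @ g

# Descent condition: <p, g> = -g^T P g < 0 whenever g != 0,
# by positive definiteness of P.
print(float(np.dot(p, g)))  # negative
```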

1. ^{{cite book | author = J. M. Ortega and W. C. Rheinboldt | title = Iterative Solution of Nonlinear Equations in Several Variables | pages = 243 | year = 1970 | doi = 10.1137/1.9780898719468 }}