Line search

In optimization, the line search strategy is one of two basic iterative approaches to find a local minimum \mathbf{x}^* of an objective function f:\mathbb{R}^n\to\mathbb{R}. The other approach is trust region. [1]
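
As a sketch of how the strategy operates, the loop below picks a descent direction and then a step length along it at every iteration. This is a minimal illustration rather than a reference implementation: the names `line_search_minimize` and `step_size` are made up for this example, and the demo uses the steepest-descent direction with a fixed step, the simplest possible choices.

```python
import numpy as np

def line_search_minimize(f, grad, x0, step_size, max_iter=100, tol=1e-8):
    """Generic line-search loop: choose a direction, then a step length along it.

    `step_size(f, grad, x, p)` may be any rule returning a step alpha > 0,
    e.g. backtracking, exact 1-D minimization, or a constant.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # (near-)stationary point reached
            break
        p = -g                             # steepest-descent direction
        alpha = step_size(f, grad, x, p)   # the "line search" along p
        x = x + alpha * p
    return x

# Demo: minimize f(x, y) = (x - 1)^2 + 4 y^2 with a constant step length.
f = lambda x: (x[0] - 1.0) ** 2 + 4.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * x[1]])
print(line_search_minimize(f, grad, [5.0, 3.0], step_size=lambda f, g, x, p: 0.1))
# approaches the minimizer [1, 0]
```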

17 relations: Backtracking line search, Conjugate gradient method, Descent direction, Golden-section search, Gradient descent, Iteration, Loss function, Mathematical optimization, Maxima and minima, Nelder–Mead method, Newton's method in optimization, Pattern search (optimization), Quasi-Newton method, Secant method, Simulated annealing, Trust region, Wolfe conditions.

Backtracking line search

In (unconstrained) minimization, a backtracking line search is a line search scheme, based on the Armijo–Goldstein condition, for determining how far to move along a given search direction: it starts from a relatively large step estimate and repeatedly shrinks it until a sufficient decrease of the objective function is achieved.
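
A minimal sketch of such a backtracking scheme, assuming p is a descent direction; the shrink factor rho, the Armijo constant c, and the function name `backtracking_line_search` are illustrative choices, not from the source.

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Start from a large step alpha0 and shrink it until the
    Armijo (sufficient-decrease) condition holds:
        f(x + alpha p) <= f(x) + c * alpha * grad(x)^T p
    Assumes p is a descent direction (grad(x)^T p < 0)."""
    alpha = alpha0
    fx = f(x)
    slope = np.dot(grad(x), p)
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho        # backtrack: try a smaller step
    return alpha
```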

Conjugate gradient method

In mathematics, the conjugate gradient method is an algorithm for the numerical solution of particular systems of linear equations, namely those whose matrix is symmetric and positive-definite.
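
A sketch of the classic CG iteration for a small symmetric positive-definite system; the variable names and the 2x2 test system are illustrative only.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x                 # residual
    p = r.copy()                  # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)          # exact step along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p      # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # ~ [0.0909, 0.6364]
```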

Descent direction

In optimization, a descent direction is a vector \mathbf{p}\in\mathbb{R}^n along which the objective function f:\mathbb{R}^n\to\mathbb{R} decreases, at least for sufficiently small steps; formally, \mathbf{p} is a descent direction at \mathbf{x} if \nabla f(\mathbf{x})^\top\mathbf{p} < 0. Iterative methods, such as line search, move along such directions to approach a local minimum \mathbf{x}^*.
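
The defining test is easy to express directly; a tiny sketch (the helper name `is_descent_direction` and the test function are illustrative).

```python
import numpy as np

def is_descent_direction(grad, x, p):
    """p is a descent direction at x iff grad(x)^T p < 0."""
    return np.dot(grad(x), p) < 0

grad = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])   # gradient of x^2 + y^2
x = np.array([1.0, 1.0])
print(is_descent_direction(grad, x, -grad(x)))   # True: negative gradient points downhill
print(is_descent_direction(grad, x, grad(x)))    # False: the gradient itself points uphill
```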

Golden-section search

The golden-section search is a technique for finding the extremum (minimum or maximum) of a strictly unimodal function by successively narrowing the range of values inside which the extremum is known to exist.
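
A compact sketch of the interval-narrowing procedure, assuming f is unimodal on [a, b]; the tolerance and the test function are illustrative.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Shrink [a, b] around the minimum of a unimodal f using golden-ratio splits."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0     # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):       # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                 # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))   # ~ 2.0
```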

Gradient descent

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function.
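
In its simplest form the update is x_{k+1} = x_k - gamma * f'(x_k); a one-dimensional sketch, where the step size gamma and the iteration count are illustrative.

```python
# Gradient descent on f(x) = x^2; gamma and the iteration count are illustrative.
f_prime = lambda x: 2.0 * x
x, gamma = 10.0, 0.1
for _ in range(100):
    x = x - gamma * f_prime(x)   # x_{k+1} = x_k - gamma * f'(x_k)
print(x)   # close to the minimizer 0
```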

Iteration

Iteration is the act of repeating a process, to generate a (possibly unbounded) sequence of outcomes, with the aim of approaching a desired goal, target or result.

Loss function

In mathematical optimization, statistics, econometrics, decision theory, machine learning and computational neuroscience, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event.
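
For instance, the squared-error loss maps a prediction and an observed value to a non-negative cost; a minimal illustration, not tied to any particular library.

```python
# Squared-error loss: the "cost" of predicting y_pred when the true value is y_true.
squared_error = lambda y_pred, y_true: (y_pred - y_true) ** 2
print(squared_error(3.0, 5.0))   # 4.0
```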

Mathematical optimization

In mathematics, computer science and operations research, mathematical optimization or mathematical programming, alternatively spelled optimisation, is the selection of a best element (with regard to some criterion) from some set of available alternatives.

Maxima and minima

In mathematical analysis, the maxima and minima (the respective plurals of maximum and minimum) of a function, known collectively as extrema (the plural of extremum), are the largest and smallest values of the function, either within a given range (the local or relative extrema) or on its entire domain (the global or absolute extrema).

Nelder–Mead method

The Nelder–Mead method (also known as the downhill simplex method or amoeba method) is a commonly applied numerical method used to find the minimum or maximum of an objective function in a multidimensional space.
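
Because it needs only function values and no derivatives, it is easy to invoke; a short sketch assuming SciPy is available, with an illustrative Rosenbrock-style test function.

```python
from scipy.optimize import minimize

# Nelder-Mead uses only function evaluations, no gradient information.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
result = minimize(f, x0=[-1.0, 2.0], method='Nelder-Mead')
print(result.x)   # should approach the minimizer [1, 1]
```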

Newton's method in optimization

In calculus, Newton's method is an iterative method for finding the roots of a differentiable function f (i.e. solutions to the equation f(x) = 0). In optimization, Newton's method is applied to the derivative f' of a twice-differentiable function f to find the roots of the derivative (solutions to f'(x) = 0), also known as the stationary points of f.
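
A one-dimensional sketch of this idea, iterating x <- x - f'(x)/f''(x) on an illustrative quartic; the function name `newton_optimize` is ours, not from the source.

```python
def newton_optimize(f_prime, f_double_prime, x0, tol=1e-10, max_iter=50):
    """Apply Newton's root-finding iteration to f' to locate a stationary point of f."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimize f(x) = x^4 - 3x^2 + 2, starting near x = 2.
print(newton_optimize(lambda x: 4 * x ** 3 - 6 * x,    # f'(x)
                      lambda x: 12 * x ** 2 - 6,       # f''(x)
                      x0=2.0))                         # ~ sqrt(3/2) = 1.2247...
```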

Pattern search (optimization)

Pattern search (also known as direct search, derivative-free search, or black-box search) is a family of numerical optimization methods that does not require a gradient.
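
One simple member of this family is compass (coordinate) search: poll the points one step away along each coordinate axis, move to any improving point, and shrink the step when none improves. A minimal sketch with illustrative parameters and test function.

```python
import numpy as np

def compass_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free pattern search over the +/- coordinate directions."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):   # the 2n poll directions
            trial = x + step * d
            if f(trial) < f(x):
                x, improved = trial, True
                break
        if not improved:
            step *= shrink                              # refine the mesh
    return x

print(compass_search(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2, [0.0, 0.0]))
# ~ [3, -1]
```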

Quasi-Newton method

Quasi-Newton methods are used, as an alternative to Newton's method, to find either zeroes or local maxima and minima of functions; instead of computing the Hessian (or Jacobian) exactly, they build up an approximation to it from successive gradient evaluations.
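
The best-known quasi-Newton scheme is BFGS, which maintains an approximation of the inverse Hessian built from gradient differences. A condensed sketch follows; the step rule, constants, and test problem are illustrative choices, not a production implementation.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Quasi-Newton minimization: update an inverse-Hessian approximation H
    from gradient differences instead of forming second derivatives."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)                              # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                             # quasi-Newton search direction
        alpha, fx = 1.0, f(x)                  # simple Armijo backtracking
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                         # curvature condition satisfied
            rho = 1.0 / sy
            I = np.eye(n)
            H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                 + rho * np.outer(s, s))       # BFGS inverse-Hessian update
        x, g = x_new, g_new
    return x

f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
print(bfgs(f, grad, [0.0, 0.0]))   # ~ [1, -2]
```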

Secant method

In numerical analysis, the secant method is a root-finding algorithm that uses a succession of roots of secant lines to better approximate a root of a function f. The secant method can be thought of as a finite-difference approximation of Newton's method.
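
A short sketch of the iteration, which replaces the derivative in Newton's update with the slope of the secant line through the last two iterates; the starting points and tolerance are illustrative.

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Root finding via successive secant-line roots."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:                 # flat secant line: cannot continue
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

print(secant(lambda x: x ** 2 - 2.0, 1.0, 2.0))   # ~ 1.41421356... (sqrt(2))
```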

Simulated annealing

Simulated annealing (SA) is a probabilistic technique for approximating the global optimum of a given function.
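
A minimal sketch of the accept/reject loop on an illustrative multimodal one-dimensional function; the temperature schedule, neighbour rule, and other parameters are arbitrary choices for the demo, and the result is stochastic.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, cooling=0.999, n_steps=20000):
    """Accept worse moves with probability exp(-delta / T); T shrinks over time,
    so the search explores early and behaves greedily near the end."""
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(n_steps):
        x_new = neighbor(x)
        f_new = f(x_new)
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x

# Wiggly 1-D objective with many local minima; neighbour = small random perturbation.
f = lambda x: x ** 2 + 10.0 * math.sin(3.0 * x)
print(simulated_annealing(f, x0=5.0,
                          neighbor=lambda x: x + random.uniform(-0.5, 0.5)))
# usually lands near the deepest valley around x ~ -0.5
```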

Trust region

In mathematical optimization, a trust region is the subset of the domain of the objective function within which the objective is approximated by a model function (often a quadratic); steps are restricted to this region because that is where the model is trusted to be accurate.
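
A minimal sketch of the surrounding trust-region loop, using the simplest model minimizer (the Cauchy point, a steepest-descent step truncated to the region) and textbook-style radius-update constants; the function name and test problem are illustrative.

```python
import numpy as np

def trust_region_cauchy(f, grad, hess, x0, delta=1.0, delta_max=10.0,
                        eta=0.15, tol=1e-8, max_iter=200):
    """Trust the quadratic model only within a ball of radius delta, and grow or
    shrink delta depending on how well the model predicted the actual decrease."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy point: steepest-descent step for the model, capped at radius delta
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, gnorm ** 3 / (delta * gBg))
        p = -tau * (delta / gnorm) * g
        predicted = -(g @ p + 0.5 * p @ B @ p)     # model decrease
        actual = f(x) - f(x + p)                   # true decrease
        rho = actual / predicted if predicted > 0 else 0.0
        if rho < 0.25:
            delta *= 0.25                          # model was poor: shrink region
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2.0 * delta, delta_max)    # good model at the boundary: grow
        if rho > eta:
            x = x + p                              # accept the step
    return x

f = lambda x: (x[0] - 1.0) ** 2 + 5.0 * (x[1] + 3.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 10.0 * (x[1] + 3.0)])
hess = lambda x: np.diag([2.0, 10.0])
print(trust_region_cauchy(f, grad, hess, [0.0, 0.0]))   # ~ [1, -3]
```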

Wolfe conditions

In the unconstrained minimization problem, the Wolfe conditions are a set of inequalities for performing inexact line search, especially in quasi-Newton methods, first published by Philip Wolfe in 1969.
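
The two inequalities are easy to check for a candidate step length; a small sketch using the commonly quoted constants c1 = 1e-4 and c2 = 0.9 (the helper name `satisfies_wolfe` is ours).

```python
import numpy as np

def satisfies_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Weak Wolfe conditions for step length alpha along direction p:
       sufficient decrease: f(x + a p) <= f(x) + c1 * a * grad(x)^T p
       curvature:           grad(x + a p)^T p >= c2 * grad(x)^T p
    """
    slope0 = np.dot(grad(x), p)
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * slope0
    curvature = np.dot(grad(x + alpha * p), p) >= c2 * slope0
    return armijo and curvature

f = lambda x: x @ x                      # f(x) = ||x||^2
grad = lambda x: 2.0 * x
x = np.array([2.0, -1.0])
p = -grad(x)                             # steepest-descent direction
print(satisfies_wolfe(f, grad, x, p, alpha=0.5))   # True  (exact minimizer along p)
print(satisfies_wolfe(f, grad, x, p, alpha=1.0))   # False (overshoots the minimum)
```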

Redirects here:

Line search method, Line-search, Linesearch, Linesearch method, Linesearch methods.

References

[1] https://en.wikipedia.org/wiki/Line_search
