13 relations: Broyden–Fletcher–Goldfarb–Shanno algorithm, Derivative-free optimization, Line search, List of numerical analysis topics, Local search (optimization), Luus–Jaakola, Mathematical optimization, Nelder–Mead method, Pattern search, Random optimization, Random search, Rosenbrock methods, Search optimization.
In numerical optimization, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm is an iterative method for solving unconstrained nonlinear optimization problems.
Derivative-free optimization is a discipline in mathematical optimization that does not use derivative information in the classical sense to find optimal solutions: sometimes information about the derivative of the objective function f is unavailable, unreliable, or impractical to obtain.
In optimization, the line search strategy is one of two basic iterative approaches to finding a local minimum \mathbf{x}^* of an objective function f:\mathbb{R}^n\to\mathbb{R}. The other is the trust-region approach.
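The line-search idea can be illustrated with a backtracking (Armijo) rule, a common concrete instance (the helper name and default constants below are illustrative, not from the source): given the gradient g and a descent direction p at x, shrink the step length t until a sufficient-decrease condition holds:

```python
def backtracking(f, x, g, p, t=1.0, c=1e-4, rho=0.5):
    """Shrink step length t until f(x + t*p) <= f(x) + c*t*(g . p)."""
    fx = f(x)
    gp = sum(gi * pi for gi, pi in zip(g, p))   # directional derivative g . p
    while f([xi + t * pi for xi, pi in zip(x, p)]) > fx + c * t * gp:
        t *= rho
    return t
```

For f(x) = x^2 at x = 1 with steepest-descent direction p = -2, the full step t = 1 overshoots the minimum, so the rule halves it once and returns t = 0.5.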
This is a list of numerical analysis topics.
In computer science, local search is a heuristic method for solving computationally hard optimization problems.
In computational engineering, Luus–Jaakola (LJ) denotes a heuristic for global optimization of a real-valued function.
In mathematics, computer science and operations research, mathematical optimization or mathematical programming, alternatively spelled optimisation, is the selection of a best element (with regard to some criterion) from some set of available alternatives.
The Nelder–Mead method (also called the downhill simplex method or amoeba method) is a commonly applied numerical method for finding the minimum or maximum of an objective function in a multidimensional space.
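A simplified sketch of the simplex logic (illustrative only; the reflection, expansion, and contraction coefficients below are the standard choices 1, 2, and 1/2, and the helper name is mine) repeatedly replaces the worst vertex of a simplex of n+1 points:

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Simplified Nelder-Mead sketch on a simplex of n+1 vertices."""
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):                          # initial simplex around x0
        p = simplex[0].copy()
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)        # centroid w/o worst
        refl = centroid + (centroid - worst)            # reflection
        if f(refl) < f(best):
            exp = centroid + 2.0 * (centroid - worst)   # expansion
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = centroid + 0.5 * (worst - centroid)  # contraction
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                        # shrink toward best
                simplex = [best + 0.5 * (p - best) for p in simplex]
    return min(simplex, key=f)
```

Note that only function values are compared; no gradients appear anywhere, which is why the method is popular for noisy or derivative-free problems.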
Pattern search may refer to several concepts; in numerical optimization it denotes a family of derivative-free direct-search methods.
Random optimization (RO) is a family of numerical optimization methods that do not require the gradient of the function being optimized; RO can hence be used on functions that are not continuous or differentiable.
Random search (RS) is a family of numerical optimization methods that do not require the gradient of the function being optimized; RS can hence be used on functions that are not continuous or differentiable.
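The common idea behind these gradient-free methods can be sketched in a few lines (a toy implementation with made-up parameter defaults, not from the source): sample a random perturbation of the current point and keep it only when it improves the objective, so no derivative is ever evaluated:

```python
import random

def random_search(f, x0, step=0.5, iters=2000, seed=0):
    """Gradient-free search: keep a random perturbation only if it improves f."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:            # greedy acceptance; no derivative information used
            x, fx = cand, fc
    return x, fx
```

Because only function values are compared, this works even on a non-differentiable objective such as f(x) = |x0 - 2| + |x1 + 1|.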
Rosenbrock methods refers to either of two distinct ideas in numerical computation, both named for Howard H. Rosenbrock.
Search optimization may refer to any of several topics.