Levenberg-Marquardt is a popular alternative to the Gauss-Newton method of finding the minimum of a function F(x) that is a sum of squares of nonlinear functions,

   F(x) = (1/2) sum_(i=1)^m [f_i(x)]^2.

Let the Jacobian of f_i(x) be denoted J_i(x); then the Levenberg-Marquardt method searches in the direction given by the solution p_k to the equations

   (J_k^T J_k + lambda_k I) p_k = -J_k^T f_k,

where lambda_k are nonnegative scalars and I is the identity matrix. The method has the nice property that, for some scalar Delta related to lambda_k, the vector p_k is the solution of the constrained subproblem of minimizing ||J_k p + f_k||^2 / 2 subject to ||p|| <= Delta (Gill et al. 1981, p. 136).
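As an illustrative sketch (not code from this article), the step equation above can be implemented directly: each iteration solves (J^T J + lambda I) p = -J^T f, and the standard heuristic enlarges lambda when a step fails to reduce the sum of squares and shrinks it when it succeeds. The function names and the exponential-fit example below are hypothetical choices for demonstration.

```python
import numpy as np

def lm_step(f, J, x, lam):
    """Solve (J^T J + lam*I) p = -J^T f at x and return the step p."""
    r = f(x)            # residual vector f(x)
    Jx = J(x)           # Jacobian of the residuals at x
    A = Jx.T @ Jx + lam * np.eye(len(x))
    return np.linalg.solve(A, -Jx.T @ r)

def lm_minimize(f, J, x0, lam=1e-2, tol=1e-10, max_iter=100):
    """Basic LM loop: grow lam when a step fails, shrink it when it succeeds."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * np.sum(f(x) ** 2)
    for _ in range(max_iter):
        p = lm_step(f, J, x, lam)
        new_cost = 0.5 * np.sum(f(x + p) ** 2)
        if new_cost < cost:            # accept: behave more like Gauss-Newton
            x, cost, lam = x + p, new_cost, lam / 10
            if np.linalg.norm(p) < tol:
                break
        else:                          # reject: behave more like gradient descent
            lam *= 10
    return x

# Example: fit y = a*exp(b*t) to synthetic data generated with a=2, b=-1.
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(-1.0 * t)
f = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])
x_fit = lm_minimize(f, J, np.array([1.0, 0.0]))
```

Note how lambda controls the interpolation: as lambda -> 0 the step approaches the Gauss-Newton step, while for large lambda it approaches a short step along the negative gradient direction -J^T f.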
The method is used by the command FindMinimum[f, {x, x0}] when given the Method -> "LevenbergMarquardt" option.
SEE ALSO: Minimum, Optimization
Bates, D. M. and Watts, D. G. Nonlinear Regression Analysis and Its Applications. New York: Wiley, 1988.
Gill, P. E.; Murray, W.; and Wright, M. H. "The Levenberg-Marquardt Method." §4.7.3 in Practical Optimization. London: Academic Press, pp. 136-137, 1981.
Levenberg, K. "A Method for the Solution of Certain Problems in Least Squares." Quart. Appl. Math. 2, 164-168, 1944.
Marquardt, D. "An Algorithm for Least-Squares Estimation of Nonlinear Parameters." SIAM J. Appl. Math. 11, 431-441, 1963.
From Wikipedia, the free encyclopedia
In mathematics and computing, the Levenberg–Marquardt algorithm (LMA) provides a numerical solution to the problem of minimizing a function, generally nonlinear, over a space of parameters of the function. These minimization problems arise especially in least squares curve fitting and nonlinear programming.
The LMA interpolates between the Gauss–Newton algorithm (GNA) and the method of gradient descent. The LMA is more robust than the GNA, which means th