Updating Quasi-Newton Matrices with Limited Storage


Since BFGS (and hence L-BFGS) is designed to minimize smooth functions without constraints, the L-BFGS algorithm must be modified to handle functions that include non-differentiable components or constraints.

Abstract: We study how to use the BFGS quasi-Newton matrices to precondition minimization methods for problems where the storage is critical.

In Newton-type methods, the search direction is computed from the inverse Hessian H, the inverse of the matrix of second derivatives of the objective. Instead of storing the dense inverse Hessian H, L-BFGS maintains a history of past updates of the position and the gradient, and forms the direction vector from that history. There are multiple published approaches using a history of updates to form this direction vector. Here, we give a common approach, the so-called "two-loop recursion." (For maximization problems, the approximation to the Hessian should be negative definite rather than positive definite.)

This is a research-grade implementation of a novel optimization algorithm. Its interface is modelled on, and borrows heavily from, the minimize.m implementation by Carl Rasmussen. Details on the code can be found in an accompanying …
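The two-loop recursion mentioned above can be sketched as follows. This is a minimal illustration, not the paper's reference code: the function name and the NumPy-based interface are assumptions. It takes the current gradient together with the stored curvature pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (oldest first) and returns the product of the implicit inverse-Hessian approximation with the gradient; negating the result gives the search direction.

```python
import numpy as np

def two_loop_recursion(grad, s_hist, y_hist):
    """Compute H_k @ grad, where H_k is the implicit L-BFGS
    inverse-Hessian approximation built from the stored (s, y)
    pairs, ordered oldest first.  (Hypothetical helper for
    illustration.)"""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    # First loop: walk the history from the newest pair to the oldest,
    # subtracting off the component of q along each y_i.
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q = q - alpha * y
    # Initial scaling gamma = (s'y)/(y'y) from the most recent pair,
    # a common choice for the seed matrix H_0 = gamma * I.
    s, y = s_hist[-1], y_hist[-1]
    r = (np.dot(s, y) / np.dot(y, y)) * q
    # Second loop: walk the history from the oldest pair to the newest.
    for (s, y, rho), alpha in zip(zip(s_hist, y_hist, rhos),
                                  reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return r  # the search direction is -r
```

On a quadratic f(x) = 0.5 * a * x**2 in one dimension, a single stored pair already makes the recursion exact: with s = [1], y = [2] (so a = 2) and grad = [4], the returned value is [2], matching the true inverse-Hessian product g / a.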
