Yuhong Dai - State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, P.O. Box 2719, Beijing 100080, China
In this paper, we briefly review extensions of quasi-Newton methods to large-scale optimization.
Specifically, based on the idea of maximum-determinant positive definite matrix completion, Yamashita (2008) proposed a new sparse quasi-Newton update, called MCQN, for unconstrained optimization problems with sparse Hessian structures. In exchange for relaxing the secant equation, the MCQN update avoids solving difficult subproblems and overcomes the ill-conditioning of the approximate Hessian matrices. A global convergence analysis is given in this paper for the MCQN update with Broyden's convex family, assuming that the objective function is uniformly convex and that the problem dimension is two.
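For reference, the two standard relations mentioned above take the following textbook forms (stated here as background; the precise relaxation of the secant equation used by MCQN is given in the body of the paper, not in this sketch):

\[
  B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]
is the secant equation imposed on the updated Hessian approximation \(B_{k+1}\), and Broyden's convex family interpolates between the BFGS and DFP updates:
\[
  B_{k+1}^{\phi} = (1-\phi)\, B_{k+1}^{\mathrm{BFGS}} + \phi\, B_{k+1}^{\mathrm{DFP}}, \qquad \phi \in [0,1].
\]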
Keywords: Quasi-Newton method,
large-scale optimization, sparsity, positive definite matrix
completion, global convergence.
Received: September 2010; Revised: September 2010; Available Online: February 2011.