
Thesis Defense - Jiang

Preconditioning the Limited-Memory BFGS Algorithm
Computer Science PhD Candidate
August 25, 2006
1:00pm-3:00pm

For large-scale unconstrained optimization, truncated Newton methods and limited-memory quasi-Newton methods are among the most effective choices. The truncated Newton method often uses a preconditioner to speed up convergence, but little previous work has explored using a preconditioner within a limited-memory quasi-Newton method. In this thesis, we focus on the LBFGS method, a limited-memory quasi-Newton method, and we implement and study a preconditioned LBFGS algorithm (PLBFGS).
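In PLBFGS, the preconditioner takes the place of the scaled-identity initial matrix in the standard L-BFGS two-loop recursion. Below is a minimal sketch in Python/NumPy, assuming a callable apply_precond that returns M^{-1} v for a positive-definite preconditioner M; the function and argument names are illustrative, not taken from the thesis.

    import numpy as np

    def plbfgs_direction(grad, s_list, y_list, apply_precond):
        """Preconditioned L-BFGS two-loop recursion (a sketch).

        s_list and y_list hold the m most recent step / gradient-difference
        pairs, oldest first.  apply_precond(v) should return M^{-1} v for a
        positive-definite preconditioner M approximating the Hessian;
        apply_precond = lambda v: v recovers plain (unpreconditioned) LBFGS.
        Returns the search direction, approximately -H^{-1} grad.
        """
        q = np.array(grad, dtype=float)
        rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: newest pair to oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * np.dot(s, q)
            q -= alpha * y
            alphas.append(alpha)
        # The preconditioner replaces the usual scaled-identity initial matrix.
        r = apply_precond(q)
        # Second loop: oldest pair to newest.
        for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
            beta = rho * np.dot(y, r)
            r += (alpha - beta) * s
        return -r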

We performed experiments on protein structure prediction problems with three kinds of preconditioners constructed from the protein potential energy function, and on a set of general test problems using a general-purpose preconditioner, the incomplete Cholesky factorization. The results show that the preconditioners give PLBFGS a significant performance improvement over LBFGS, and that PLBFGS performs competitively with the preconditioned truncated Newton method.
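For the general-purpose preconditioner, one standard construction is the zero fill-in incomplete Cholesky factorization M = L L^T, which is applied through two triangular solves. The following is a rough dense-storage sketch under that assumption, again with illustrative names (the thesis does not specify its implementation details):

    import numpy as np
    from scipy.linalg import solve_triangular

    def ic0(A):
        """Zero fill-in incomplete Cholesky IC(0) of an SPD matrix A.

        Dense storage for clarity; entries of L are kept only where the
        lower triangle of A is nonzero.  Note that IC(0) can break down
        (non-positive pivot) on some SPD matrices.
        """
        L = np.tril(np.asarray(A, dtype=float))
        n = L.shape[0]
        for k in range(n):
            L[k, k] = np.sqrt(L[k, k])
            for i in range(k + 1, n):
                if L[i, k] != 0.0:
                    L[i, k] /= L[k, k]
            # Update the trailing submatrix, only on the sparsity pattern.
            for j in range(k + 1, n):
                for i in range(j, n):
                    if L[i, j] != 0.0:
                        L[i, j] -= L[i, k] * L[j, k]
        return L

    def apply_ic_precond(L, v):
        """Return M^{-1} v for M = L @ L.T via forward/back substitution."""
        w = solve_triangular(L, v, lower=True)
        return solve_triangular(L.T, w, lower=False)

A factorization of (an approximation to) the Hessian would then plug into the two-loop recursion above as apply_precond = lambda v: apply_ic_precond(L, v).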

Committee: Richard Byrd, Professor (Chair)
Robert (Bobby) Schnabel, Professor
Xiao-Chuan Cai, Professor
Elizabeth Jessup, Professor
Thomas Manteuffel, Department of Applied Mathematics