A dense initialization for limited-memory quasi-Newton methods
University of California, Merced, CA, USA.
Linköping University, Department of Mathematics, Optimization; Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0003-1836-4200
Wake Forest University, Winston-Salem, NC, USA. (Department of Mathematics)
University of California, Merced, CA, USA. (Applied Mathematics)
2019 (English). In: Computational Optimization and Applications, ISSN 0926-6003, Vol. 74, no. 1, p. 121-142. Article in journal (Other academic). Published.
Abstract [en]

We consider a family of dense initializations for limited-memory quasi-Newton methods. The proposed initialization exploits an eigendecomposition-based separation of the full space into two complementary subspaces, assigning a different initialization parameter to each subspace. This family of dense initializations is proposed in the context of a limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) trust-region method that makes use of a shape-changing norm to define each subproblem. As with L-BFGS methods that traditionally use diagonal initialization, the dense initialization and the sequence of generated quasi-Newton matrices are never explicitly formed. Numerical experiments on the CUTEst test set suggest that this initialization together with the shape-changing trust-region method outperforms other L-BFGS methods for solving general nonconvex unconstrained optimization problems. While this dense initialization is proposed in the context of a special trust-region method, it has broad applications for more general quasi-Newton trust-region and line search methods. In fact, this initialization is suitable for use with any quasi-Newton update that admits a compact representation and, in particular, any member of the Broyden class of updates.
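
As an illustrative sketch of the idea (the notation below is ours, not taken from the record): if the compact representation of the L-BFGS matrix provides an orthogonal matrix $P = [\,P_\parallel \;\; P_\perp\,]$, where $P_\parallel$ spans the subspace generated by the stored quasi-Newton pairs and $P_\perp$ spans its orthogonal complement, then a dense initialization of the kind described above can be written as

  $B_0 = \gamma_\parallel P_\parallel P_\parallel^T + \gamma_\perp P_\perp P_\perp^T$,

with a separate scalar $\gamma_\parallel$ and $\gamma_\perp$ assigned to each of the two complementary subspaces. Choosing $\gamma_\parallel = \gamma_\perp = \gamma$ recovers the conventional multiple-of-identity initialization $B_0 = \gamma I$; the specific parameter choices and the shape-changing trust-region norm are detailed in the full text.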

Place, publisher, year, edition, pages
Springer, 2019. Vol. 74, no. 1, p. 121-142
Keywords [en]
Large-scale nonlinear optimization, limited-memory quasi-Newton methods, trust-region methods, quasi-Newton matrices, shape-changing norm.
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:liu:diva-143315
DOI: 10.1007/s10589-019-00112-x
ISI: 000476600200005
OAI: oai:DiVA.org:liu-143315
DiVA id: diva2:1162458
Note

Funding agencies: NSF [CMMI-1334042, CMMI-1333326, IIS-1741490, IIS-1741264]

Available from: 2017-12-04. Created: 2017-12-04. Last updated: 2019-08-12. Bibliographically approved.

Open Access in DiVA

A dense initialization for limited-memory quasi-Newton methods (570 kB), 95 downloads
File information
File name: FULLTEXT02.pdf. File size: 570 kB. Checksum (SHA-512):
e7ecf5a6c477409aef896fcdcc44905f3799a319d6353d984bc3597901786b967a2d535e257655236f67222899cdf4f52b41ca181060ebadd9962c1a5573509d
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text; link to full text at arXiv.org

Authority records

Burdakov, Oleg

Total: 95 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
