
Details

Author(s) / Contributors
Title
A gradient method exploiting the two dimensional quadratic termination property
Is part of
  • Optimization letters, 2023-07, Vol.17 (6), p.1413-1434
Place / Publisher
Berlin/Heidelberg: Springer Berlin Heidelberg
Year of publication
2023
Source
SpringerLink (Online service)
Descriptions / Notes
  • The quadratic termination property is important to the efficiency of gradient methods. We consider equipping a family of gradient methods, whose stepsize is given by the ratio of two norms, with two-dimensional quadratic termination. This desired property is achieved by incorporating a new stepsize, derived by maximizing the stepsize of the considered family at the next iteration. It is proved that each method in the family asymptotically alternates in a two-dimensional subspace spanned by the eigenvectors corresponding to the largest and smallest eigenvalues. Based on this asymptotic behavior, we show that the new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, by adaptively taking the long Barzilai–Borwein stepsize and reusing the new stepsize with retard, we propose an efficient gradient method for unconstrained quadratic optimization. We prove that the new method is R-linearly convergent with a rate of 1 − 1/κ, where κ is the condition number of the Hessian. Numerical experiments demonstrate the efficiency of the proposed method.
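The abstract builds on the long Barzilai–Borwein stepsize for unconstrained quadratic optimization. As a minimal sketch of that baseline (not the paper's proposed method, which additionally reuses a new stepsize with retard), the following assumes a symmetric positive definite matrix `A`, so f(x) = ½xᵀAx − bᵀx has gradient Ax − b and the long BB stepsize is sᵀs / sᵀy:

```python
import numpy as np

def bb_gradient_quadratic(A, b, x0, tol=1e-8, max_iter=1000):
    """Gradient method for f(x) = 0.5 x^T A x - b^T x with the
    long Barzilai-Borwein stepsize (A assumed symmetric positive definite)."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)   # safe first stepsize: 1 / largest eigenvalue bound
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # BB difference vectors
        alpha = (s @ s) / (s @ y)        # long BB stepsize s^T s / s^T y
        x, g = x_new, g_new
    return x

# Usage: for A = diag(1, 10) the minimizer is A^{-1} b = (1, 0.1)
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = bb_gradient_quadratic(A, b, np.zeros(2))
```

On this diagonal example the iterates quickly align with the extreme eigendirections, illustrating the two-dimensional asymptotic behavior the abstract describes.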
Language
English
Identifiers
ISSN: 1862-4472
eISSN: 1862-4480
DOI: 10.1007/s11590-022-01936-z
