Details

Author(s) / Contributors
Title
Fast Multiobjective Gradient Methods with Nesterov Acceleration via Inertial Gradient-Like Systems
Is part of
  • Journal of optimization theory and applications, 2024-05, Vol.201 (2), p.539-582
Place / Publisher
New York: Springer US
Year of publication
2024
Link to full text
Source
SpringerLink (Online service)
Descriptions / Notes
  • We derive efficient algorithms to compute weakly Pareto optimal solutions for smooth, convex and unconstrained multiobjective optimization problems in general Hilbert spaces. To this end, we define a novel inertial gradient-like dynamical system in the multiobjective setting whose trajectories converge weakly to Pareto optimal solutions. Discretization of this system yields an inertial multiobjective algorithm which generates sequences that converge weakly to Pareto optimal solutions. We employ Nesterov acceleration to define an algorithm with an improved convergence rate compared to the plain multiobjective steepest descent method (Algorithm 1). A further improvement in terms of efficiency is achieved by avoiding the solution of a quadratic subproblem to compute a common step direction for all objective functions, which is usually required in first-order methods. Using a different discretization of our inertial gradient-like dynamical system, we obtain an accelerated multiobjective gradient method that does not require the solution of a subproblem in each step (Algorithm 2). While this algorithm does not converge in general, it yields good results on test problems and is faster than standard steepest descent.
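  The following is a minimal, illustrative sketch of the baseline the abstract compares against: the plain multiobjective steepest descent step, whose common descent direction for all objectives comes from a small quadratic subproblem (the standard formulation min_d max_i <grad f_i(x), d> + 1/2 ||d||^2, solved here via its dual over the simplex). It is not Algorithm 1 or 2 from the paper; the test objectives, step size, and helper names are hypothetical and only serve to make the per-iteration subproblem cost concrete.

```python
# Illustrative sketch of plain multiobjective steepest descent (the baseline
# in the abstract), not the paper's accelerated Algorithms 1/2. Objectives,
# step size, and function names below are hypothetical.
import numpy as np
from scipy.optimize import minimize

def common_descent_direction(grads):
    """Solve the quadratic subproblem min ||sum_i lam_i * g_i||^2 over the
    simplex; the common step direction is d = -sum_i lam_i * g_i."""
    G = np.asarray(grads)                      # shape (m, n): one gradient per objective
    m = G.shape[0]
    Q = G @ G.T                                # Gram matrix of the gradients
    obj = lambda lam: 0.5 * lam @ Q @ lam
    cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * m
    lam0 = np.full(m, 1.0 / m)
    lam = minimize(obj, lam0, bounds=bounds, constraints=cons).x
    return -G.T @ lam                          # common descent direction d(x)

# Hypothetical smooth convex bi-objective test problem
f1 = lambda x: np.sum((x - 1.0) ** 2)
f2 = lambda x: np.sum((x + 1.0) ** 2)
g1 = lambda x: 2.0 * (x - 1.0)
g2 = lambda x: 2.0 * (x + 1.0)

x = np.array([3.0, -2.0])
for _ in range(100):                           # fixed step size for illustration
    d = common_descent_direction([g1(x), g2(x)])
    if np.linalg.norm(d) < 1e-8:               # d(x) = 0 characterizes Pareto criticality
        break
    x = x + 0.25 * d
print(x, f1(x), f2(x))
```

  Per the abstract, the paper's methods add Nesterov-type inertia on top of such gradient steps, and Algorithm 2 avoids the per-iteration quadratic subproblem altogether.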
