Preprint / working paper. Year: 2022

First-Order Objective-Function-Free Optimization Algorithms and Their Complexity

Abstract

A class of algorithms for unconstrained nonconvex optimization is considered in which the value of the objective function is never computed. The class contains a deterministic version of the first-order Adagrad method typically used for the minimization of noisy functions, but also allows the use of second-order information when available. The rate of convergence of methods in the class is analyzed and shown to be identical to that known for first-order optimization methods using both function and gradient values. The result is essentially sharp and improves on previously known complexity bounds (in the stochastic context) by Defossez et al. (2020) and Gratton et al. (2022). A new class of methods is designed, for which a slightly worse and essentially sharp complexity result holds. Limited numerical experiments show that the new methods' performance may be comparable to that of standard steepest descent, despite using significantly less information, and that this performance is relatively insensitive to noise.
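To make the objective-function-free idea concrete, here is a minimal Python sketch of the kind of deterministic Adagrad-type iteration the abstract alludes to: only gradient values drive the step, and the objective f(x) is never evaluated. The function name, the offset zeta, the iteration count, and the quadratic test problem are illustrative assumptions, not the paper's exact algorithm (which, among other things, also admits second-order information).

```python
import numpy as np

def adagrad_offo(grad, x0, n_iters=500, zeta=1e-2):
    """Deterministic Adagrad-type iteration that never evaluates f(x).

    Only gradients are used: each component of the step is the current
    gradient entry scaled by the square root of the accumulated squared
    gradient history (plus a small offset zeta to avoid division by zero).
    Names and constants here are illustrative, not the paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    accum = np.full_like(x, zeta)   # running sum of squared gradient entries
    for _ in range(n_iters):
        g = grad(x)
        accum += g * g              # update the per-component history
        x -= g / np.sqrt(accum)     # objective-function-free step
    return x

# Example: minimize a simple quadratic 0.5 * x^T A x using gradients only.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    print(adagrad_offo(grad, x0=[3.0, -2.0]))
```

Note that no line search or sufficient-decrease test appears anywhere: the accumulated gradient history alone controls the step length, which is precisely what makes the method objective-function-free.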

Dates and versions

hal-03718811, version 1 (09-07-2022)

License

Attribution (CC BY)

Identifiers

Cite

Serge Gratton, Sadok Jerad, Philippe L. Toint. First-Order Objective-Function-Free Optimization Algorithms and Their Complexity. 2022. ⟨hal-03718811⟩