Journal article: Journal of Optimization Theory and Applications, 2021

Incremental Without Replacement Sampling in Nonconvex Optimization

Abstract

Minibatch decomposition methods for empirical risk minimization are commonly analysed in a stochastic approximation setting, also known as sampling with replacement. On the other hand, modern implementations of such techniques are incremental: they rely on sampling without replacement, for which available analyses are much scarcer. We provide convergence guarantees for the latter variant by analysing a versatile incremental gradient scheme. For this scheme, we consider constant, decreasing, or adaptive step sizes. In the smooth setting we obtain explicit complexity estimates in terms of the epoch counter. In the nonsmooth setting we prove that the sequence is attracted by solutions of the optimality conditions of the problem.
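To make the distinction concrete, the incremental (without-replacement) variant can be sketched as follows: each epoch visits every component gradient exactly once, in a freshly shuffled order. This is a minimal illustration of the sampling scheme discussed in the abstract, not the paper's actual algorithm; the function and parameter names are chosen for the example.

```python
import numpy as np

def incremental_gradient(grads, x0, n_epochs, step):
    """Incremental gradient descent with sampling WITHOUT replacement:
    each epoch applies every component gradient exactly once, in a
    random order (random reshuffling), with a constant step size."""
    x = x0
    rng = np.random.default_rng(0)
    n = len(grads)
    for _ in range(n_epochs):
        # Without replacement: a permutation visits each index once per epoch,
        # unlike i.i.d. sampling with replacement, which may repeat or skip indices.
        order = rng.permutation(n)
        for i in order:
            x = x - step * grads[i](x)
    return x

# Example: minimize the mean of (x - a_i)^2; the minimizer is the mean of the a_i.
a = [1.0, 2.0, 3.0]
grads = [lambda x, ai=ai: 2.0 * (x - ai) for ai in a]
x = incremental_gradient(grads, x0=0.0, n_epochs=300, step=0.05)
```

With a constant step size the iterates settle in a neighbourhood of the minimizer whose radius shrinks with the step size; decreasing or adaptive step sizes, as considered in the paper, drive the iterates to the solution itself.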

Dates and versions

hal-02896102 , version 1 (10-07-2020)
hal-02896102 , version 2 (19-04-2021)
hal-02896102 , version 3 (15-06-2021)
hal-02896102 , version 4 (26-12-2022)

Cite

Edouard Pauwels. Incremental Without Replacement Sampling in Nonconvex Optimization. Journal of Optimization Theory and Applications, 2021, ⟨10.1007/s10957-021-01883-2⟩. ⟨hal-02896102v4⟩