Intermediate Gradient Methods with Relative Inexactness
Kornilov, Nikita ; Alkousa, Mohammad ; Gorbunov, Eduard ; Stonyakin, Fedor ; Dvurechensky, Pavel ; Gasnikov, Alexander
Department
Statistics and Data Science
Type
Journal article
Date
2025
Language
English
Abstract
This paper is devoted to studying first-order methods for smooth convex optimization with inexact gradients. Unlike the majority of the literature on this topic, we consider the setting of relative rather than absolute inexactness. More precisely, we assume that the additive error in the gradient is proportional to the gradient norm, rather than being globally bounded by some small quantity. We propose a novel analysis of the accelerated gradient method under relative inexactness and strong convexity, improving the bound on the maximum admissible error that preserves the algorithm's linear convergence. In other words, we analyze the robustness of the accelerated gradient method to relative gradient inexactness. Furthermore, using the Performance Estimation Problem (PEP) technique, we demonstrate that the obtained result is tight up to a numerical constant. Motivated by existing intermediate methods with absolute error, i.e., methods whose convergence rates interpolate between the slower but more robust non-accelerated algorithms and the faster yet less robust accelerated algorithms, we propose an adaptive variant of the intermediate gradient method with relative gradient error.
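The relative-inexactness model described above can be illustrated with a minimal numerical sketch. The code below is not the paper's algorithm; it simulates plain gradient descent on an assumed strongly convex quadratic, with an oracle whose error is at most a fraction delta of the true gradient norm, to show that small relative errors still permit linear convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # True gradient of the illustrative quadratic f(x) = 0.5 * x^T diag(1..d) x
    return np.arange(1, x.size + 1) * x

def inexact_grad(x, delta):
    # Relative-inexactness oracle: return g with ||g - grad f(x)|| <= delta * ||grad f(x)||
    g = grad_f(x)
    noise = rng.standard_normal(g.size)
    noise *= delta * np.linalg.norm(g) / max(np.linalg.norm(noise), 1e-12)
    return g + noise

def gd_relative(x0, delta, steps=200, lr=0.1):
    # Plain gradient descent driven by the inexact oracle
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * inexact_grad(x, delta)
    return x

x0 = np.ones(5)
x_clean = gd_relative(x0, delta=0.0)   # exact gradients
x_noisy = gd_relative(x0, delta=0.1)   # 10% relative gradient error
```

With delta strictly below 1, the inexact gradient remains a descent direction, so the iterates still contract toward the minimizer at a linear rate; the step size and delta here are illustrative choices, not the admissible-error bounds derived in the paper.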
Citation
N. Kornilov, M. Alkousa, E. Gorbunov, F. Stonyakin, P. Dvurechensky, and A. Gasnikov, “Intermediate Gradient Methods with Relative Inexactness,” Journal of Optimization Theory and Applications, vol. 207, no. 3, pp. 1–42, Aug. 2025, doi: 10.1007/s10957-025-02809-y
Source
Journal of Optimization Theory and Applications
Keywords
Accelerated Methods, Inexact Gradient, Intermediate Method, Performance Estimation Problem, Relative Noise, Errors, Gradient Methods, First-order Methods, Convex Optimization
Publisher
Springer Nature
