
Exploring Jacobian Inexactness in Second-Order Methods for Variational Inequalities: Lower Bounds, Optimal Algorithms and Quasi-Newton Approximations

Agafonov, Artem
Ostroukhov, Petr
Mozhaev, Roman
Yakovlev, Konstantin
Gorbunov, Eduard
Takac, Martin
Gasnikov, Alexander
Kamzolov, Dmitry
Abstract
Variational inequalities represent a broad class of problems, including minimization and min-max problems, commonly found in machine learning. Existing second-order and high-order methods for variational inequalities require precise computation of derivatives, often resulting in prohibitively high iteration costs. In this work, we study the impact of Jacobian inaccuracy on second-order methods. For the smooth and monotone case, we establish a lower bound with explicit dependence on the level of Jacobian inaccuracy and propose an optimal algorithm for this key setting. When derivatives are exact, our method converges at the same rate as exact optimal second-order methods. To reduce the cost of solving the auxiliary problem, which arises in all high-order methods with global convergence, we introduce several Quasi-Newton approximations. Our method with Quasi-Newton updates achieves a global sublinear convergence rate. We extend our approach with a tensor generalization for inexact high-order derivatives and support the theory with experiments.
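To make the abstract's central idea concrete, here is a minimal numerical sketch of a second-order (Newton-type) iteration for a variational inequality that uses an *inexact* Jacobian, obtained by finite differences, in place of exact derivatives. This is an illustration of the general setting only, not the paper's optimal algorithm or its Quasi-Newton updates; the operator, step rule, and all constants (step regularization `reg`, difference step `h`) are assumptions chosen for the example.

```python
import numpy as np

def approx_jacobian(F, z, h=1e-6):
    # Forward-difference Jacobian estimate: each column costs one extra
    # evaluation of F; h controls the level of Jacobian inexactness.
    n = z.size
    Fz = F(z)
    J = np.empty((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        J[:, i] = (F(z + e) - Fz) / h
    return J

def inexact_newton_vi(F, z0, iters=20, reg=1e-3):
    # Regularized Newton-type iteration z+ = z - (J + reg*I)^{-1} F(z)
    # with an inexact Jacobian J. Generic sketch, not the paper's method.
    z = z0.astype(float)
    for _ in range(iters):
        J = approx_jacobian(F, z)
        z = z - np.linalg.solve(J + reg * np.eye(z.size), F(z))
    return z

# Monotone operator of a regularized bilinear min-max game:
# F(x, y) = (0.1*x + A y + c, 0.1*y - A^T x + d); its symmetric part
# is 0.1*I, so the operator is strongly monotone.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
c, d = rng.standard_normal(2), rng.standard_normal(2)

def F(z):
    x, y = z[:2], z[2:]
    return np.concatenate([0.1 * x + A @ y + c, 0.1 * y - A.T @ x + d])

z = inexact_newton_vi(F, np.zeros(4))
print(np.linalg.norm(F(z)))  # residual after 20 steps
```

For this strongly monotone linear operator, the residual `||F(z)||` contracts at each step even though the Jacobian is only approximate, which is the regime the paper quantifies: convergence rates that degrade gracefully with the level of Jacobian inaccuracy.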
Citation
A. Agafonov et al., “Exploring Jacobian Inexactness in Second-Order Methods for Variational Inequalities: Lower Bounds, Optimal Algorithms and Quasi-Newton Approximations,” Adv Neural Inf Process Syst, vol. 37, pp. 115816–115860, Dec. 2024.
Source
Advances in Neural Information Processing Systems (NeurIPS 2024)
Conference
Keywords
Variational inequalities, Second-order methods, Jacobian inaccuracy, Lower bounds, Quasi-Newton approximations
Publisher
NeurIPS