Accelerated Adaptive Cubic Regularized Quasi-Newton Methods
Kamzolov, Dmitry I. ; Ziu, Klea ; Agafonov, Artem D. ; Takáč, Martin
Department
Machine Learning
Type
Journal article
Date
2026
Language
English
Abstract
In this paper, we propose the first Quasi-Newton method with a global convergence rate of O(k^{-1}) for general convex functions. Quasi-Newton methods, such as BFGS and SR-1, are well known for their impressive practical performance. However, they are theoretically slower than gradient descent for general convex functions. This gap between impressive practical performance and poor theoretical guarantees has been a long-standing open question. In this paper, we make a significant step toward closing this gap. We improve upon the existing rate and propose the Cubic Regularized Quasi-Newton Method with a convergence rate of O(k^{-1}). The key to achieving this improvement is to use the Cubic Regularized Newton Method, rather than the Damped Newton Method, as the outer method, with the Quasi-Newton update serving as an inexact Hessian approximation. Using this approach, we propose the first Accelerated Quasi-Newton method with a global convergence rate of O(k^{-2}) for general convex functions. In special cases where we have access to additional computations, for example, Hessian-vector products, we can improve the inexact Hessian approximation and achieve a global convergence rate of O(k^{-3}), which makes it an intermediate second-order method. To make these methods practical, we introduce the Adaptive Inexact Cubic Regularized Newton Method and its accelerated version, which provide real-time control of the approximation error. We show that the proposed methods have impressive practical performance and outperform both first- and second-order methods.
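For illustration only, the cubic regularized step with an inexact Hessian that the abstract refers to can be sketched in standard notation (the symbols x_k, B_k, and M_k below are generic placeholders for the iterate, the quasi-Newton Hessian approximation, and the regularization parameter, and are not necessarily the notation used in the paper):

x_{k+1} \in \operatorname*{arg\,min}_{y} \Big\{ \langle \nabla f(x_k),\, y - x_k \rangle + \tfrac{1}{2} \langle B_k (y - x_k),\, y - x_k \rangle + \tfrac{M_k}{6} \lVert y - x_k \rVert^3 \Big\},

where B_k \approx \nabla^2 f(x_k) plays the role of the inexact Hessian approximation produced by the quasi-Newton update, in contrast to the exact Hessian used in the classical Cubic Regularized Newton Method.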
Citation
D. Kamzolov, K. Ziu, A. Agafonov, and M. Takáč, “Accelerated Adaptive Cubic Regularized Quasi-Newton Methods,” J Optim Theory Appl, vol. 208, no. 1, pp. 1–46, Jan. 2026, doi: 10.1007/S10957-025-02804-3
Source
Journal of Optimization Theory and Applications
Keywords
BFGS, Cubic Newton Method, High-order methods, Optimization, Quasi-Newton Methods, Second-order methods
Publisher
Springer Nature
