Decentralised convex optimisation with probability-proportional-to-size quantization
Pasechnyuk, D.A. ; Dvurechensky, P. ; Uribe, C.A. ; Gasnikov, A.V.
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
Communication is one of the main bottlenecks of distributed optimisation and learning. To overcome this bottleneck, we propose a novel quantization method that transforms a vector into a sample of component indices drawn from a categorical distribution with probabilities proportional to the values at those components. We then propose primal and primal-dual accelerated stochastic gradient methods that use this quantization, and derive their convergence rates in terms of probabilities of large deviations. We focus on affine-constrained convex optimisation and its application to decentralised distributed optimisation problems. To illustrate the behaviour of our algorithm, we apply it to the decentralised computation of semi-discrete entropy-regularized Wasserstein barycentres.
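The quantization idea described in the abstract can be sketched in a few lines: instead of transmitting a dense vector, one transmits k component indices sampled with probabilities proportional to the component values, from which an unbiased estimate of the vector can be reconstructed. The sketch below is illustrative only (the function name, the restriction to non-negative vectors, and the frequency-based reconstruction are assumptions, not the paper's exact construction):

```python
import numpy as np

def pps_quantize(x, k, rng=None):
    # Illustrative probability-proportional-to-size quantization sketch.
    # Assumes a non-negative vector x; encodes it as k i.i.d. indices
    # sampled with probabilities proportional to the component values.
    rng = rng or np.random.default_rng(0)
    p = x / x.sum()                         # categorical probabilities
    idx = rng.choice(len(x), size=k, p=p)   # the "quantized" message: k indices
    # Reconstruction: empirical index frequencies scaled by the l1 mass of x.
    # Each component estimate is unbiased, E[x_hat] = x.
    counts = np.bincount(idx, minlength=len(x))
    return counts / k * x.sum()

x = np.array([4.0, 1.0, 3.0, 2.0])
x_hat = pps_quantize(x, k=100_000)  # close to x for large k
```

The communication saving comes from sending k small integers (indices) plus one scalar (the l1 mass) rather than a full floating-point vector; the estimate's variance decays as 1/k.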
Citation
D. Pasechnyuk, P. Dvurechensky, C. A. Uribe, and A. Gasnikov, “Decentralised convex optimisation with probability-proportional-to-size quantization,” EURO Journal on Computational Optimization, vol. 13, p. 100113, Jan. 2025, doi: 10.1016/J.EJCO.2025.100113
Source
EURO Journal on Computational Optimization
Keywords
Accelerated gradient method, Decentralised optimisation, Large deviation, Primal-dual algorithm, Quantization, Wasserstein barycentre
Publisher
Elsevier
