Quantize Once, Train Fast: Allreduce-Compatible Compression with Provable Guarantees

Authors
Xin, Jihao
Canini, Marco
Richtárik, Peter
Horváth, Samuel
Department
Machine Learning
Type
Conference proceeding
Date
2025
Language
English
Abstract
Distributed training enables large-scale deep learning, but suffers from high communication overhead, especially as models and datasets grow. Gradient compression, particularly quantization, is a promising approach to mitigate this bottleneck. However, existing quantization schemes are often incompatible with Allreduce, the dominant communication primitive in distributed deep learning, and many prior solutions rely on heuristics without theoretical guarantees. We introduce Global-QSGD, an Allreduce-compatible gradient quantization method that leverages global norm scaling to reduce communication overhead while preserving accuracy. Global-QSGD is backed by rigorous theoretical analysis, extending standard unbiased compressor frameworks to establish formal convergence guarantees. Additionally, we develop a performance model to evaluate its impact across different hardware configurations. Extensive experiments on NVLink, PCIe, and large-scale cloud environments show that Global-QSGD accelerates distributed training by up to 3.51× over baseline quantization methods, making it a practical and efficient solution for large-scale deep learning workloads.
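The abstract's central idea, quantizing every worker's gradient against one shared global scale so the resulting integer payloads can be summed directly by Allreduce, can be sketched in a small simulation. This is an illustrative sketch only, not the paper's implementation: the `quantize` helper, the choice of 256 levels, and the stochastic-rounding details are assumptions.

```python
import numpy as np

def quantize(grad, global_norm, levels=256, rng=None):
    """Stochastically round `grad` to integer levels under a scale
    derived from a GLOBAL norm shared by all workers, so that the
    integer payloads of different workers can be summed directly."""
    rng = rng or np.random.default_rng()
    scale = global_norm / levels
    scaled = grad / scale
    floor = np.floor(scaled)
    # Stochastic rounding keeps the compressor unbiased in expectation.
    q = floor + (rng.random(grad.shape) < (scaled - floor))
    return q.astype(np.int32), scale

# Simulated workers. Because every worker uses the SAME scale (from the
# global norm, obtainable with one scalar Allreduce), summing the integer
# payloads -- exactly what Allreduce computes -- is meaningful.
rng = np.random.default_rng(0)
grads = [rng.standard_normal(1000) for _ in range(4)]
global_norm = max(np.max(np.abs(g)) for g in grads)

payloads = []
for g in grads:
    q, scale = quantize(g, global_norm, rng=rng)
    payloads.append(q)

summed = np.sum(payloads, axis=0)    # stands in for Allreduce(sum) on ints
avg = summed * scale / len(grads)    # dequantize once, after the reduction

exact = np.mean(grads, axis=0)
print(np.abs(avg - exact).mean())    # mean quantization error of the average
```

Contrast this with per-worker scaling (as in plain QSGD): there each worker's integers are relative to its own norm, so a sum of payloads is meaningless and the scheme needs Allgather instead of Allreduce.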
Citation
J. Xin, M. Canini, P. Richtárik, and S. Horváth, “Quantize Once, Train Fast: Allreduce-Compatible Compression with Provable Guarantees,” pp. 2658–2665, Oct. 2025, doi: 10.3233/FAIA251118
Source
Frontiers in Artificial Intelligence and Applications
Conference
28th European Conference on Artificial Intelligence, ECAI 2025, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025
Publisher
IOS Press