CPFedAvg: Enhancing Hierarchical Federated Learning via Optimized Local Aggregation and Parameter Mixing
Liu, Xuezheng ; Zhou, Yipeng ; Wu, Di ; Hu, Miao ; Chen, Min ; Guizani, Mohsen ; Sheng, Quan Z.
Department
Machine Learning
Type
Journal article
Date
2025
Language
English
Abstract
Hierarchical federated learning (HFL) improves the scalability and efficiency of traditional federated learning (FL) by incorporating a hierarchical topology into the FL framework. In a typical HFL system, clients are divided into multiple tiers, and the training process involves both local and global model aggregation. However, existing HFL approaches have several significant drawbacks. First, the root parameter server (PS) is a single point of failure and also acts as a bottleneck for global aggregation. Moreover, frequent global aggregation over the wide area network (WAN) incurs substantial communication costs, which degrade training efficiency. In this paper, we propose a novel HFL algorithm called CPFedAvg to address these challenges. CPFedAvg introduces a root-free hierarchical topology whose top tier consists of multiple PSes, thereby eliminating the issues associated with a root PS. Additionally, we replace the expensive global aggregation with parameter mixing operations among the top-tier PSes. We analyze the convergence rate of CPFedAvg under non-convex loss. Based on this analysis, we formulate a convex optimization problem to optimize the number of local aggregations executed between consecutive parameter mixing operations. To simulate real-world communication networks, we develop FedNetSimulator, which simulates a diverse range of FL communication processes. Finally, we conduct extensive experiments on real datasets (i.e., CIFAR-10 and CIFAR-100). The experimental results demonstrate that CPFedAvg improves model accuracy by up to 18%, with a speedup of up to 6x over state-of-the-art baselines.
Citation
X. Liu et al., "CPFedAvg: Enhancing Hierarchical Federated Learning via Optimized Local Aggregation and Parameter Mixing," in IEEE Transactions on Networking, doi: 10.1109/TON.2025.3526866
Source
IEEE Transactions on Networking
Keywords
Topology, Convergence, Training, Servers, Federated learning, Costs, Wide area networks, Scalability, Data models, Clustering algorithms
Publisher
IEEE
