Paper Title
Improving Federated Learning Communication Efficiency with Global Momentum Fusion for Gradient Compression Schemes
Paper Authors
Paper Abstract
Communication costs in federated learning (FL) hinder system scalability, limiting how much data can be reached from more clients. FL typically adopts a hub-and-spoke network topology in which all clients communicate through a central server, so techniques such as data compression have been proposed to reduce communication overhead and mitigate this issue. Another challenge of federated learning is unbalanced data distribution: in a typical federated learning setting, the data on each client are not independent and identically distributed (non-IID). In this paper, we propose a new compression compensation scheme called Global Momentum Fusion (GMF), which reduces communication overheads between FL clients and the server while maintaining comparable model accuracy in the presence of non-IID data. GitHub repository: https://github.com/tony92151/global-momentum-fusion-fl
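The abstract only names the technique, so the following is a minimal sketch of the general idea rather than the paper's actual algorithm: top-k gradient sparsification with local error feedback (the usual compression compensation mechanism), plus a server-broadcast global momentum term fused into the local gradient before compression. The class name `GMFClient`, the fusion coefficient `beta`, the sparsification `ratio`, and the server-side momentum update rule are all illustrative assumptions; the exact GMF update is defined in the paper and repository.

```python
import numpy as np

def top_k_sparsify(tensor, ratio):
    """Zero out all but the k largest-magnitude entries of `tensor`."""
    flat = tensor.ravel()
    k = max(1, int(flat.size * ratio))
    keep = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[keep] = flat[keep]
    return out.reshape(tensor.shape)

class GMFClient:
    """Client-side compressor sketch: local error feedback plus a
    server-broadcast global momentum term fused into the gradient
    before sparsification. Names and the fusion rule are assumptions,
    not the paper's verified update."""
    def __init__(self, shape, fusion_coeff=0.9, ratio=0.01):
        self.residual = np.zeros(shape)  # error-feedback memory
        self.beta = fusion_coeff         # hypothetical fusion weight
        self.ratio = ratio               # fraction of entries transmitted

    def compress(self, local_grad, global_momentum):
        # Fuse the global momentum into the local gradient, add the
        # residual carried over from previous rounds, then sparsify.
        fused = local_grad + self.beta * global_momentum + self.residual
        sparse = top_k_sparsify(fused, self.ratio)
        self.residual = fused - sparse   # carry compression error forward
        return sparse

# Toy round: one client, a random gradient, and a server-side momentum
# buffer updated from the sparse upload (assumed momentum rule).
rng = np.random.default_rng(0)
client = GMFClient(shape=(1000,), fusion_coeff=0.9, ratio=0.05)
global_momentum = np.zeros(1000)
grad = rng.standard_normal(1000)
update = client.compress(grad, global_momentum)   # ~95% of entries are zero
global_momentum = 0.9 * global_momentum + update
```

Under these assumptions, the error-feedback residual provides the "compensation" part of the scheme (no gradient mass is permanently discarded), while fusing the global momentum injects a shared global descent direction into each client's update, which is one plausible way to counter the client drift caused by non-IID data.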