Paper Title

Reducing Training Time in Cross-Silo Federated Learning using Multigraph Topology

Paper Authors

Tuong Do, Binh X. Nguyen, Vuong Pham, Toan Tran, Erman Tjiputra, Quang D. Tran, Anh Nguyen

Paper Abstract

Federated learning is an active research topic since it enables several participants to jointly train a model without sharing local data. Currently, cross-silo federated learning is a popular training setting that utilizes a few hundred reliable data silos with high-speed access links to train a model. While this approach has been widely applied in real-world scenarios, designing a robust topology to reduce the training time remains an open problem. In this paper, we present a new multigraph topology for cross-silo federated learning. We first construct the multigraph using the overlay graph. We then parse this multigraph into different simple graphs with isolated nodes. The existence of isolated nodes allows us to perform model aggregation without waiting for other nodes, hence effectively reducing the training time. Intensive experiments on three public datasets show that our proposed method significantly reduces the training time compared with recent state-of-the-art topologies while maintaining the accuracy of the learned model. Our code can be found at https://github.com/aioz-ai/MultigraphFL
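
To make the parsing step concrete, below is a minimal Python sketch using networkx. It is not taken from the authors' repository: the rule that a silo pair with edge multiplicity k exchanges models only every k-th round is an illustrative assumption, and the node names and helper function are hypothetical.

import networkx as nx

def parse_multigraph(multigraph: nx.MultiGraph, num_rounds: int):
    """Split a multigraph into one simple graph per training round."""
    rounds = []
    for t in range(num_rounds):
        g = nx.Graph()
        g.add_nodes_from(multigraph.nodes)
        # Collapse parallel edges to unique silo pairs, then decide
        # per round whether that pair communicates.
        for u, v in nx.Graph(multigraph).edges():
            multiplicity = multigraph.number_of_edges(u, v)
            # Assumption (not from the paper text): a pair with k
            # parallel edges exchanges models only every k-th round.
            if t % multiplicity == 0:
                g.add_edge(u, v)
        rounds.append(g)
    return rounds

# Toy overlay: the B-C and C-D pairs have multiplicity 2, so C and D
# sit out every other round and become isolated nodes.
overlay = nx.MultiGraph()
overlay.add_edges_from([("A", "B"), ("B", "C"), ("B", "C"),
                        ("C", "D"), ("C", "D")])
for t, g in enumerate(parse_multigraph(overlay, num_rounds=4)):
    isolated = [n for n in g.nodes if g.degree(n) == 0]
    # Isolated nodes aggregate their local model without waiting for peers.
    print(f"round {t}: edges={sorted(g.edges)} isolated={isolated}")

In this toy overlay, silos C and D have no active edge on odd-numbered rounds; as isolated nodes they can aggregate their local models immediately instead of waiting on peers, which is the mechanism the abstract credits for the reduced training time.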
