Paper Title
Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
Paper Authors
Paper Abstract
This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning. DSVGD maintains a number of non-random and interacting particles at a central server to represent the current iterate of the model global posterior. The particles are iteratively downloaded and updated by one of the agents with the end goal of minimizing the global free energy. By varying the number of particles, DSVGD enables a flexible trade-off between per-iteration communication load and number of communication rounds. DSVGD is shown to compare favorably to benchmark frequentist and Bayesian federated learning strategies, also scheduling a single device per iteration, in terms of accuracy and scalability with respect to the number of agents, while also providing well-calibrated, and hence trustworthy, predictions.
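For intuition, below is a minimal sketch, not the authors' released code, of the plain Stein Variational Gradient Descent update that DSVGD builds on. In generalized Bayesian learning, the global free energy minimized by the particles is typically of the form F(q) = Σ_k E_{θ∼q}[ℓ_k(θ)] + KL(q ‖ p₀), over the agents' local losses ℓ_k and a prior p₀ (a standard formulation, stated here as an assumption rather than a detail from the paper). The `score_fn` interface, RBF bandwidth, and toy Gaussian target are illustrative choices.

```python
# Minimal SVGD sketch (illustrative only; not the paper's DSVGD implementation).
import numpy as np

def svgd_step(X, score_fn, step=0.1, bandwidth=1.0):
    """One SVGD update of the particle set X (n particles x d dims).

    score_fn(X) returns grad log p at each particle; in DSVGD the target
    would instead come from an agent's local free-energy objective
    (hypothetical interface here).
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]               # diffs[j, i] = x_j - x_i
    K = np.exp(-np.sum(diffs**2, axis=-1) / bandwidth)  # RBF kernel matrix
    grads = score_fn(X)                                 # row j: grad log p(x_j)
    # Driving term: kernel-weighted scores pull particles toward high density.
    drive = K.T @ grads / n
    # Repulsive term: kernel gradients spread particles to cover the posterior.
    repulse = -2.0 / bandwidth * np.einsum('ji,jid->id', K, diffs) / n
    return X + step * (drive + repulse)

# Toy usage: particles converge to a standard 2-D Gaussian target.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2)) * 3.0
for _ in range(200):
    X = svgd_step(X, score_fn=lambda X: -X)  # score of N(0, I) is -x
```

The deterministic, interacting particles are what enable the trade-off described in the abstract: each communication round transmits the particle set, so fewer particles mean a lighter per-round load at the cost of more rounds.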