Paper Title


Personalized Federated Learning via Convex Clustering

Paper Authors

Aleksandar Armacki, Dragana Bajovic, Dusan Jakovetic, Soummya Kar

Paper Abstract


We propose a parametric family of algorithms for personalized federated learning with locally convex user costs. The proposed framework is based on a generalization of convex clustering in which the differences between different users' models are penalized via a sum-of-norms penalty, weighted by a penalty parameter $\lambda$. The proposed approach enables "automatic" model clustering, without prior knowledge of the hidden cluster structure or the number of clusters. Analytical bounds on the weight parameter that lead to simultaneous personalization, generalization, and automatic model clustering are provided. The solution to the formulated problem enables personalization, by providing different models across different clusters, and generalization, by providing models different from the per-user models computed in isolation. We then provide an efficient algorithm based on the Parallel Direction Method of Multipliers (PDMM) to solve the proposed formulation in a federated server-users setting. Numerical experiments corroborate our findings. As an interesting byproduct, our results provide several generalizations to convex clustering.
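Based on the abstract's description, the optimization problem plausibly takes the following form (the symbols $f_i$, $x_i$, and the user count $m$ are notational assumptions, not taken from the source): each user $i$ holds a locally convex cost $f_i$ and a model $x_i$, and pairwise model differences are penalized by a sum of norms weighted by $\lambda$:

$$\min_{x_1,\dots,x_m} \; \sum_{i=1}^{m} f_i(x_i) \;+\; \lambda \sum_{i < j} \lVert x_i - x_j \rVert.$$

With $\lambda = 0$ this recovers the isolated per-user models, while a sufficiently large $\lambda$ forces all $x_i$ to coincide; intermediate values of $\lambda$ group the $x_i$ into clusters, which is the "automatic" model clustering behavior the abstract refers to.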
