Paper Title

Personalized Federated Learning with Moreau Envelopes

Authors

Dinh, Canh T., Tran, Nguyen H., Nguyen, Tuan Dung

Abstract


Federated learning (FL) is a decentralized and privacy-preserving machine learning technique in which a group of clients collaborate with a server to learn a global model without sharing clients' data. One challenge associated with FL is statistical diversity among clients, which restricts the global model from delivering good performance on each client's task. To address this, we propose an algorithm for personalized FL (pFedMe) using Moreau envelopes as clients' regularized loss functions, which help decouple personalized model optimization from the global model learning in a bi-level problem stylized for personalized FL. Theoretically, we show that pFedMe's convergence rate is state-of-the-art: achieving quadratic speedup for strongly convex and sublinear speedup of order 2/3 for smooth nonconvex objectives. Experimentally, we verify that pFedMe excels at empirical performance compared with the vanilla FedAvg and Per-FedAvg, a meta-learning based personalized FL algorithm.
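The bi-level problem mentioned in the abstract can be made concrete. A sketch of the formulation, using standard notation that may differ from the paper's (here \(f_i\) is client \(i\)'s loss, \(\theta_i\) its personalized model, \(w\) the global model, and \(\lambda\) the regularization strength):

```latex
% Moreau envelope of client i's loss, used as the regularized loss F_i (inner level):
F_i(w) = \min_{\theta_i \in \mathbb{R}^d}
  \Big\{ f_i(\theta_i) + \frac{\lambda}{2}\,\lVert \theta_i - w \rVert^2 \Big\}

% Global model learned over N clients (outer level):
\min_{w \in \mathbb{R}^d} \; F(w) \;=\; \frac{1}{N}\sum_{i=1}^{N} F_i(w)

% The personalized model is the proximal operator of f_i/\lambda at w:
\hat{\theta}_i(w) = \operatorname{prox}_{f_i/\lambda}(w)
```

The decoupling the abstract refers to is visible here: the inner minimization tailors \(\theta_i\) to client \(i\)'s data while staying within a \(\lambda\)-controlled distance of \(w\), and the outer problem optimizes \(w\) only through the smoothed envelopes \(F_i\).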
