Paper Title

Encoded Gradients Aggregation against Gradient Leakage in Federated Learning

Paper Authors

Dun Zeng, Shiyu Liu, Siqi Liang, Zonghang Li, Hui Wang, Irwin King, Zenglin Xu

Paper Abstract

Federated learning enables isolated clients to train a shared model collaboratively by aggregating locally computed gradient updates. However, private information can be leaked from the uploaded gradients and exposed to malicious attackers or an honest-but-curious server. Although additive homomorphic encryption guarantees the security of this process, it imposes an unacceptable computation and communication burden on FL participants. To mitigate the cost of secure aggregation while maintaining learning performance, we propose a new framework called Encoded Gradient Aggregation (EGA). In detail, EGA first encodes local gradient updates into an encoded domain and injects noise at each client before aggregation at the server. The aggregated encoded gradients can then be recovered for the global model update via a decoding function. This scheme prevents the raw gradients of a single client from being exposed on the Internet and keeps them unknown to the server. EGA provides optimization and communication benefits under different noise levels and defends against gradient leakage. We further provide a theoretical analysis of the approximation error and its impact on federated optimization. Moreover, EGA is compatible with most federated optimization algorithms. We conduct extensive experiments to evaluate EGA in real-world federated settings, and the results demonstrate its efficacy.
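
For intuition, below is a minimal, hypothetical sketch of the encode/aggregate/decode flow the abstract describes. It is not the paper's actual EGA construction: the invertible encoding matrix A, the Gaussian noise model, and all names and parameters are assumptions made purely for illustration; the real encoding and decoding functions and noise design are defined in the paper.

```python
# Illustrative stand-in for an encode -> aggregate -> decode flow in the spirit of EGA.
# NOTE: the encoding matrix A, the Gaussian noise, and all parameters below are
# assumptions for this sketch only, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients, noise_std = 8, 5, 1e-3

# Hypothetical shared encoding: a well-conditioned invertible linear map held by
# clients and the decoder, but not meaningful to the aggregation server on its own.
A = rng.standard_normal((dim, dim)) + dim * np.eye(dim)
A_inv = np.linalg.inv(A)

def encode(grad: np.ndarray) -> np.ndarray:
    """Client side: map the raw gradient into the encoded domain and inject noise."""
    return A @ grad + rng.normal(0.0, noise_std, size=grad.shape)

def decode(encoded_sum: np.ndarray) -> np.ndarray:
    """Decoder side: approximately recover the sum of the raw gradients."""
    return A_inv @ encoded_sum

# Local gradients that never leave the clients in raw form.
local_grads = [rng.standard_normal(dim) for _ in range(num_clients)]

# The server only sees and sums encoded, noisy gradients.
aggregated_encoded = sum(encode(g) for g in local_grads)

# Decoding yields the aggregate up to an approximation error driven by the noise level.
recovered = decode(aggregated_encoded)
true_sum = sum(local_grads)
print("approximation error:", np.linalg.norm(recovered - true_sum))
```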
