Paper Title


Federated Deep Unfolding for Sparse Recovery

Paper Authors

Komal Krishna Mogilipalepu, Sumanth Kumar Modukuri, Amarlingam Madapu, Sundeep Prabhakar Chepuri

Paper Abstract


This paper proposes a federated learning technique for deep algorithm unfolding, with applications to sparse signal recovery and compressed sensing. We refer to this architecture as Fed-CS. Specifically, we unfold and learn the iterative shrinkage thresholding algorithm for sparse signal recovery without transporting the training data, which is distributed across many clients, to a central location. We propose a layer-wise federated learning technique, in which each client uses its local data to train one layer of a common model at a time. Only the parameters of that layer are then transmitted from all the clients to the server, which aggregates these local models to arrive at a consensus model. The proposed layer-wise federated learning for sparse recovery is communication efficient and preserves data privacy. Through numerical experiments on synthetic and real datasets, we demonstrate the efficacy of Fed-CS and present the trade-offs in terms of the number of participating clients and the communication involved, compared with a centralized deep unfolding approach.
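The abstract describes a layer-wise protocol: each client trains one layer of the unfolded (LISTA-style) network on its local data, the server averages only that layer's parameters to form a consensus, and training then moves to the next layer. The following is a minimal sketch of that idea in PyTorch, not the authors' implementation; the LISTA parameterization, the loss, the optimizer settings, and names such as `LISTALayer`, `local_train_layer`, and `aggregate_layer` are illustrative assumptions.

```python
# Minimal sketch of layer-wise federated training of an unfolded ISTA (LISTA-style)
# network. Assumptions (not from the paper): LISTA parameterization x_{k+1} =
# soft(W1 y + W2 x_k, theta_k), MSE training loss, Adam optimizer, FedAvg-style
# averaging of a single layer's parameters per communication stage.
import torch
import torch.nn as nn

def soft_threshold(x, theta):
    # Elementwise soft-thresholding with a learnable threshold theta.
    return torch.sign(x) * torch.relu(torch.abs(x) - theta)

class LISTALayer(nn.Module):
    def __init__(self, m, n):
        super().__init__()
        self.W1 = nn.Linear(m, n, bias=False)   # acts on the measurement y
        self.W2 = nn.Linear(n, n, bias=False)   # acts on the previous estimate
        self.theta = nn.Parameter(torch.tensor(0.1))

    def forward(self, y, x_prev):
        return soft_threshold(self.W1(y) + self.W2(x_prev), self.theta)

class UnfoldedISTA(nn.Module):
    def __init__(self, m, n, num_layers):
        super().__init__()
        self.layers = nn.ModuleList([LISTALayer(m, n) for _ in range(num_layers)])

    def forward(self, y, up_to=None):
        # Run the first `up_to` layers (all layers if up_to is None).
        x = torch.zeros(y.shape[0], self.layers[0].W2.in_features)
        for layer in self.layers[:up_to]:
            x = layer(y, x)
        return x

def local_train_layer(model, k, y, x_true, epochs=5, lr=1e-3):
    # Client side: train only the parameters of layer k on local data (y, x_true).
    opt = torch.optim.Adam(model.layers[k].parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(y, up_to=k + 1), x_true)
        loss.backward()
        opt.step()
    return {name: p.detach().clone()
            for name, p in model.layers[k].named_parameters()}

def aggregate_layer(client_params):
    # Server side: average each parameter of the layer across clients (FedAvg-style).
    return {name: torch.stack([p[name] for p in client_params]).mean(dim=0)
            for name in client_params[0]}

# Toy layer-wise federated loop, just to illustrate the message flow.
m, n, L, num_clients = 20, 50, 4, 3
server = UnfoldedISTA(m, n, L)
A = torch.randn(m, n)
clients = []
for _ in range(num_clients):
    x = torch.randn(64, n) * (torch.rand(64, n) < 0.1).float()   # sparse signals
    clients.append((x @ A.T, x))                                  # local (y, x_true)

for k in range(L):                                   # one layer per stage
    updates = []
    for y, x_true in clients:
        local = UnfoldedISTA(m, n, L)
        local.load_state_dict(server.state_dict())   # broadcast current model
        updates.append(local_train_layer(local, k, y, x_true))
    server.layers[k].load_state_dict(aggregate_layer(updates))   # consensus layer k
```

In this sketch, only one layer's parameters travel between clients and server per communication stage, rather than the whole unfolded network, which reflects the communication efficiency claimed in the abstract; the raw training data never leaves the clients.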
