Paper Title
Scalable and Communication-efficient Decentralized Federated Edge Learning with Multi-blockchain Framework
Paper Authors
Paper Abstract
The emerging Federated Edge Learning (FEL) technique has drawn considerable attention: it not only ensures good machine learning performance but also addresses the "data island" problem caused by data privacy concerns. However, large-scale FEL still faces two crucial challenges: (i) there is no secure and communication-efficient model training scheme for FEL, and (ii) there is no scalable and flexible FEL framework for managing local model updates and global model sharing (trading). To bridge these gaps, we first propose a blockchain-empowered secure FEL system with a hierarchical blockchain framework consisting of a main chain and subchains. This framework achieves scalable and flexible decentralized FEL by managing local model updates and model-sharing records separately, providing performance isolation. A Proof-of-Verifying consensus scheme is then designed to filter out low-quality model updates and to manage qualified updates in a decentralized and secure manner, thereby achieving secure FEL. To improve the communication efficiency of blockchain-empowered FEL, a gradient compression scheme is designed to generate sparse but important gradients, reducing communication overhead without compromising accuracy while further strengthening the privacy protection of the training data. Security analysis and numerical results show that the proposed schemes achieve secure, scalable, and communication-efficient decentralized FEL.
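The Proof-of-Verifying idea of filtering out low-quality updates before they are recorded on chain can be sketched as a validation-loss check. The snippet below is a minimal, hypothetical Python illustration: the function names (`verify_update`, `mse_loss`) and the acceptance rule (the loss must not worsen by more than `quality_threshold`) are assumptions of this sketch, not the paper's exact consensus protocol.

```python
import numpy as np

def verify_update(update, base_model, val_x, val_y, loss_fn,
                  quality_threshold=0.0):
    """Accept a local model update only if it does not worsen the
    validation loss by more than quality_threshold (assumed rule)."""
    loss_before = loss_fn(base_model, val_x, val_y)
    loss_after = loss_fn(base_model + update, val_x, val_y)
    return loss_after <= loss_before + quality_threshold

def mse_loss(weights, x, y):
    """Squared error of a linear model; an illustrative stand-in only."""
    return float(np.mean((x @ weights - y) ** 2))
```

In a decentralized setting, a committee of verifier nodes would each run such a check on a submitted update; only updates approved by a quorum would be packed into a block on the corresponding subchain, and the rest discarded.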
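Similarly, the gradient compression scheme can be illustrated with top-k sparsification, a common way to produce "sparse but important" gradients. In the sketch below, the sparsity ratio `k_ratio` and the function names are assumptions for illustration; a worker transmits an (index, value) payload instead of the full dense gradient, and the paper's actual compressor may differ, e.g. in how residuals are accumulated.

```python
import numpy as np

def topk_sparsify(grad, k_ratio=0.01):
    """Keep the k largest-magnitude gradient entries; drop the rest.
    k_ratio is an assumed hyperparameter, not a value from the paper."""
    flat = grad.ravel()
    k = max(1, int(k_ratio * flat.size))
    # Indices of the k entries with the largest absolute values.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    # Transmit only (idx, values, shape) instead of the dense gradient.
    return idx, flat[idx], grad.shape

def desparsify(idx, values, shape):
    """Receiver side: rebuild a dense gradient from the sparse payload."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[idx] = values
    return flat.reshape(shape)
```

Because only a small fraction of coordinates leaves the device, this reduces per-round communication cost and also reveals less about the raw training data, which matches the privacy argument in the abstract.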