Paper Title
FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
Paper Authors
Paper Abstract
This paper presents FedX, an unsupervised federated learning framework. Our model learns unbiased representations from decentralized and heterogeneous local data. It employs two-sided knowledge distillation with contrastive learning as a core component, allowing the federated system to function without requiring clients to share any data features. Furthermore, its adaptable architecture can be used as an add-on module for existing unsupervised algorithms in federated settings. Experiments show that our model significantly improves the performance of five unsupervised algorithms (by 1.58--5.52pp).
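To illustrate the kind of mechanism the abstract describes, the sketch below shows one common way to realize knowledge distillation over embeddings: each model produces a similarity distribution over a shared set of anchor embeddings, and a KL-divergence term pulls the two distributions together, applied in both directions ("two-sided"). All function names, the cosine/softmax formulation, and the temperature value are illustrative assumptions for exposition, not FedX's exact loss.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def softmax(xs, temperature=0.1):
    # Temperature-scaled softmax turning similarities into a distribution.
    m = max(xs)
    exps = [math.exp((x - m) / temperature) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def distill_loss(student_emb, teacher_emb, anchors, temperature=0.1):
    # KL(teacher || student) between similarity distributions over anchors.
    p_t = softmax([cosine(teacher_emb, a) for a in anchors], temperature)
    p_s = softmax([cosine(student_emb, a) for a in anchors], temperature)
    return sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s) if pt > 0)

def two_sided_loss(local_emb, global_emb, anchors, temperature=0.1):
    # "Two-sided": distill local -> global and global -> local, then sum.
    return (distill_loss(local_emb, global_emb, anchors, temperature)
            + distill_loss(global_emb, local_emb, anchors, temperature))
```

Because only embedding similarities (not raw inputs) enter the loss, a term of this shape is compatible with the abstract's claim that clients need not share data features.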