Paper Title

Federated Selective Aggregation for Knowledge Amalgamation

Authors

Donglin Xie, Ruonan Yu, Gongfan Fang, Jie Song, Zunlei Feng, Xinchao Wang, Li Sun, Mingli Song

Abstract

In this paper, we explore a new knowledge-amalgamation problem, termed Federated Selective Aggregation (FedSA). The goal of FedSA is to train a student model for a new task with the help of several decentralized teachers, whose pre-training tasks and data are different and unknown to the student. Our motivation for investigating such a problem setup stems from a recent dilemma of model sharing. Many researchers and institutes have spent enormous resources on training large and competent networks. However, due to privacy, security, or intellectual-property concerns, they are unable to share their pre-trained models, even if they wish to contribute to the community. The proposed FedSA offers a solution to this dilemma and goes one step further: the learned student may specialize in a new task different from those of all the teachers. To this end, we propose a dedicated strategy for handling FedSA. Specifically, the student-training process is driven by a novel saliency-based approach that adaptively selects teachers as participants and integrates their representative capabilities into the student. To evaluate the effectiveness of FedSA, we conduct experiments in both single-task and multi-task settings. Experimental results demonstrate that FedSA effectively amalgamates knowledge from decentralized models and achieves performance competitive with centralized baselines.
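
The abstract only names the key ingredient, a saliency-based mechanism that adaptively selects teachers and integrates their representations into the student; it does not specify the formulation. The PyTorch sketch below is therefore a minimal illustration under stated assumptions, not the paper's implementation: teacher_saliency, fedsa_step, the input-gradient saliency proxy, the top-k selection rule, and the MSE feature-matching loss are all hypothetical stand-ins.

import torch
import torch.nn as nn
import torch.nn.functional as F

def teacher_saliency(teacher: nn.Module, x: torch.Tensor) -> torch.Tensor:
    # Hypothetical per-batch saliency score for a teacher: the mean magnitude
    # of the gradient of its feature norm w.r.t. the input (a common
    # gradient-based saliency proxy; the paper's actual measure is not given).
    x = x.clone().requires_grad_(True)
    feat = teacher(x)
    (grad,) = torch.autograd.grad(feat.norm(), x)
    return grad.abs().mean()

def fedsa_step(student, teachers, x, optimizer, k=2):
    # One assumed FedSA-style update: adaptively select the k most salient
    # teachers for this batch, then align the student's representation with
    # theirs via a feature-matching (MSE) loss.
    scores = torch.stack([teacher_saliency(t, x) for t in teachers])
    selected = scores.topk(k).indices.tolist()  # the selected participants
    s_feat = student(x)
    loss = sum(F.mse_loss(s_feat, teachers[i](x).detach()) for i in selected) / k
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: four frozen "teachers" sharing a feature dimension with the student.
if __name__ == "__main__":
    torch.manual_seed(0)
    make_net = lambda: nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
    teachers = [make_net().requires_grad_(False) for _ in range(4)]
    student = make_net()
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    print("loss:", fedsa_step(student, teachers, torch.randn(8, 32), optimizer))

Note that in this toy everything runs locally purely for illustration; in the federated setting the abstract describes, saliency scores and feature targets would have to be computed on each teacher's side so that raw data and pre-trained models never leave their owners.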
