Paper Title

Decoupled Federated Learning for ASR with Non-IID Data

Paper Authors

Han Zhu, Jindong Wang, Gaofeng Cheng, Pengyuan Zhang, Yonghong Yan

Paper Abstract

Automatic speech recognition (ASR) with federated learning (FL) makes it possible to leverage data from multiple clients without compromising privacy. The quality of FL-based ASR can be measured by recognition performance, communication cost, and computation cost. When data among different clients are not independently and identically distributed (non-IID), performance can degrade significantly. In this work, we tackle the non-IID issue in FL-based ASR with personalized FL, which learns a personalized model for each client. Concretely, we propose two types of personalized FL approaches for ASR. First, we adapt personalization-layer-based FL to ASR, which keeps some layers local to learn personalized models. Second, to reduce the communication and computation costs, we propose decoupled federated learning (DecoupleFL). On one hand, DecoupleFL moves the computation burden to the server, decreasing the computation on clients. On the other hand, DecoupleFL communicates secure high-level features instead of model parameters, reducing the communication cost when models are large. Experiments demonstrate that the two proposed personalized FL-based ASR approaches reduce WER by 2.3%-3.4% compared with FedAvg. Among them, DecoupleFL incurs only 11.4% of the communication and 75% of the computation cost of FedAvg, which is also significantly less than personalization-layer-based FL.
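To make the personalization-layer idea concrete, here is a minimal NumPy sketch of one aggregation round: shared layers are averaged FedAvg-style across clients, while personalization layers never leave the device. This is an illustration of the general technique, not the authors' implementation; the layer names (encoder.*/decoder.*), the two-client setup, and the equal weights are all hypothetical.

```python
import numpy as np

# Hypothetical layer split; the paper does not specify which layers stay local.
SHARED = ["encoder.w", "encoder.b"]      # averaged across clients each round
PERSONAL = ["decoder.w", "decoder.b"]    # kept on each client, never communicated

def fedavg_round(client_models, weights):
    """Average only the shared layers; personalization layers stay local."""
    agg = {k: sum(w * m[k] for w, m in zip(weights, client_models))
           for k in SHARED}
    for m in client_models:
        m.update({k: agg[k].copy() for k in SHARED})
    return client_models

# Toy usage: two clients with random parameters and equal data weights.
rng = np.random.default_rng(0)
clients = [{k: rng.normal(size=(2, 2)) for k in SHARED + PERSONAL}
           for _ in range(2)]
clients = fedavg_round(clients, weights=[0.5, 0.5])
assert np.allclose(clients[0]["encoder.w"], clients[1]["encoder.w"])      # shared layers synced
assert not np.allclose(clients[0]["decoder.w"], clients[1]["decoder.w"])  # personal layers differ
```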
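DecoupleFL, as summarized in the abstract, instead uploads (secured) high-level features and shifts training of the shared part to the server. The sketch below assumes a linear back-end with a squared-error loss and hypothetical tensor shapes, none of which come from the paper; it only illustrates why the exchanged payload is features rather than model parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def client_extract(x, local_frontend):
    """Client side: a personalized front-end maps raw input to features.
    Only these features (optionally privacy-protected) leave the device."""
    return np.tanh(x @ local_frontend)

def server_step(features, labels, shared_backend, lr=0.1):
    """Server side: one gradient step of a linear back-end on uploaded
    features, so the heavy training computation runs on the server."""
    preds = features @ shared_backend
    grad = features.T @ (preds - labels) / len(features)
    return shared_backend - lr * grad

# Toy round: one client uploads features, the server updates the back-end.
x, y = rng.normal(size=(8, 4)), rng.normal(size=(8, 3))
frontend = rng.normal(size=(4, 5))   # stays on the client
backend = np.zeros((5, 3))           # lives on the server
feats = client_extract(x, frontend)  # upload cost scales with features, not model size
backend = server_step(feats, y, backend)
```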
