Paper Title

Continual Learning of Natural Language Processing Tasks: A Survey

Paper Authors

Zixuan Ke, Bing Liu

Paper Abstract

Continual learning (CL) is a learning paradigm that emulates the human capability of continually learning and accumulating knowledge without forgetting previously learned knowledge, and of transferring learned knowledge to help learn new tasks better. This survey presents a comprehensive review and analysis of recent progress on CL in NLP, which differs significantly from CL in computer vision and machine learning. It covers (1) all CL settings, with a taxonomy of existing techniques; (2) catastrophic forgetting (CF) prevention; (3) knowledge transfer (KT), which is particularly important for NLP tasks; and (4) some theory and the hidden challenge of inter-task class separation (ICS). Items (1), (3), and (4) have not been covered in existing surveys. Finally, a list of future directions is discussed.
