Paper Title


All Birds with One Stone: Multi-task Text Classification for Efficient Inference with One Forward Pass

Paper Authors

Jiaxin Huang, Tianqi Liu, Jialu Liu, Adam D. Lelkes, Cong Yu, Jiawei Han

Paper Abstract


Multi-Task Learning (MTL) models have shown their robustness, effectiveness, and efficiency for transferring learned knowledge across tasks. In real industrial applications such as web content classification, multiple classification tasks are predicted from the same input text, such as a web article. However, at serving time, existing multi-task transformer models, such as prompt- or adapter-based approaches, need to conduct N forward passes for N tasks, at O(N) computation cost. To tackle this problem, we propose a scalable method that achieves stronger performance at close to O(1) computation cost via only one forward pass. To illustrate real application usage, we release a multi-task dataset on news topic and style classification. Our experiments show that our proposed method outperforms strong baselines on both the GLUE benchmark and our news dataset. Our code and dataset are publicly available at https://bit.ly/mtop-code.
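The efficiency argument in the abstract can be illustrated with a toy sketch: if the expensive shared encoder is run once and each task only adds a lightweight head on top of the shared representation, serving N tasks costs one encoder pass instead of N. This is a minimal numpy sketch under assumed shapes and a stand-in encoder, not the paper's actual transformer architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the real model is a transformer encoder.
D_IN, D_HID = 32, 16
NUM_CLASSES = [4, 2, 5]  # each of the 3 tasks has its own label space

W_enc = rng.normal(size=(D_IN, D_HID))                     # shared encoder weights
heads = [rng.normal(size=(D_HID, c)) for c in NUM_CLASSES]  # per-task heads

def encode(x):
    """Shared encoder: the expensive part, run once per input."""
    return np.tanh(x @ W_enc)

def predict_all_tasks(x):
    """One encoder forward pass, then N cheap task-specific projections."""
    h = encode(x)                    # O(1) encoder passes, not O(N)
    return [h @ W for W in heads]    # lightweight per-task logits

x = rng.normal(size=(1, D_IN))
logits = predict_all_tasks(x)
print([l.shape for l in logits])  # [(1, 4), (1, 2), (1, 5)]
```

In contrast, prompt- or adapter-based baselines condition the encoder itself on the task, so the expensive `encode` step must be repeated once per task.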
