Paper Title

Multi-CLS BERT: An Efficient Alternative to Traditional Ensembling

Paper Authors

Haw-Shiuan Chang, Ruei-Yao Sun, Kathryn Ricci, Andrew McCallum

Paper Abstract

Ensembling BERT models often significantly improves accuracy, but at the cost of significantly more computation and memory footprint. In this work, we propose Multi-CLS BERT, a novel ensembling method for CLS-based prediction tasks that is almost as efficient as a single BERT model. Multi-CLS BERT uses multiple CLS tokens with a parameterization and objective that encourages their diversity. Thus instead of fine-tuning each BERT model in an ensemble (and running them all at test time), we need only fine-tune our single Multi-CLS BERT model (and run the one model at test time, ensembling just the multiple final CLS embeddings). To test its effectiveness, we build Multi-CLS BERT on top of a state-of-the-art pretraining method for BERT (Aroca-Ouellette and Rudzicz, 2020). In experiments on GLUE and SuperGLUE we show that our Multi-CLS BERT reliably improves both overall accuracy and confidence estimation. When only 100 training samples are available in GLUE, the Multi-CLS BERT_Base model can even outperform the corresponding BERT_Large model. We analyze the behavior of our Multi-CLS BERT, showing that it has many of the same characteristics and behavior as a typical BERT 5-way ensemble, but with nearly 4-times less computation and memory.
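To make the test-time step described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch: a single encoder pass yields K final CLS embeddings per example, and their per-CLS predictions are averaged into one ensemble prediction. The class name MultiCLSHead, the per-CLS linear classifiers, and the logit averaging are illustrative assumptions only; they do not reproduce the paper's actual parameterization or its diversity-encouraging pretraining objective.

```python
import torch
import torch.nn as nn


class MultiCLSHead(nn.Module):
    """Hypothetical sketch: ensemble K final CLS embeddings from one encoder pass.

    Assumes the encoder prepends K [CLS]-like tokens to the input, so its output
    contains K CLS hidden states per example. Only the "average the per-CLS
    predictions of a single model" idea is illustrated here.
    """

    def __init__(self, hidden_size: int, num_labels: int, num_cls: int = 5):
        super().__init__()
        # One classifier per CLS token (illustrative choice, not the paper's design).
        self.classifiers = nn.ModuleList(
            [nn.Linear(hidden_size, num_labels) for _ in range(num_cls)]
        )

    def forward(self, cls_states: torch.Tensor) -> torch.Tensor:
        # cls_states: (batch, num_cls, hidden_size) -- the K final CLS embeddings.
        logits = torch.stack(
            [head(cls_states[:, k]) for k, head in enumerate(self.classifiers)],
            dim=1,
        )  # (batch, num_cls, num_labels)
        # "Ensemble" by averaging the K per-CLS predictions from the single model.
        return logits.mean(dim=1)


if __name__ == "__main__":
    # Toy usage: random tensors stand in for CLS embeddings from a BERT_Base encoder.
    head = MultiCLSHead(hidden_size=768, num_labels=2, num_cls=5)
    fake_cls = torch.randn(4, 5, 768)   # batch of 4 examples, 5 CLS tokens each
    print(head(fake_cls).shape)         # torch.Size([4, 2])
```

This is why only one forward pass is needed at test time: the ensemble members share the entire encoder and differ only in their CLS tokens, rather than being five separately fine-tuned BERT models.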
