Paper Title

Self-attention-based BiGRU and capsule network for named entity recognition

Authors

Deng, Jianfeng; Cheng, Lianglun; Wang, Zhuowei

Abstract

Named entity recognition (NER) is one of the fundamental tasks of natural language processing (NLP). To address the problems that traditional character representations have weak expressive ability and that neural network methods fail to capture important sequence information, a self-attention-based bidirectional gated recurrent unit (BiGRU) and capsule network (CapsNet) model for NER is proposed. The model generates character vectors with a Bidirectional Encoder Representations from Transformers (BERT) pre-trained model. A BiGRU captures sequence context features, and a self-attention mechanism assigns different weights to the information captured by the BiGRU's hidden layers. Finally, a CapsNet performs entity recognition. We evaluated the recognition performance of the model on two datasets. Experimental results show that the model achieves better performance without relying on external dictionary information.
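The abstract describes a four-stage pipeline: BERT character vectors, a BiGRU over the character sequence, self-attention over the BiGRU hidden states, and a capsule head for per-token label prediction. Below is a minimal PyTorch sketch of that pipeline, assuming a `bert-base-chinese` backbone, scaled dot-product self-attention, and a simplified capsule head without dynamic routing; all layer names and sizes are illustrative assumptions, not the authors' exact architecture.

```python
# A minimal sketch of the described pipeline: BERT character vectors ->
# BiGRU -> self-attention -> capsule head. Layer sizes, the choice of
# bert-base-chinese, and the routing-free capsule head are assumptions
# for illustration, not the paper's reported configuration.
import torch
import torch.nn as nn
from transformers import BertModel


def squash(x, dim=-1):
    # Capsule squashing non-linearity: preserves vector orientation,
    # scales vector length into [0, 1).
    norm_sq = (x ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * x / (norm_sq.sqrt() + 1e-8)


class BertBiGRUAttnCaps(nn.Module):
    def __init__(self, num_labels, gru_hidden=128, caps_dim=16):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.bigru = nn.GRU(self.bert.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        d = 2 * gru_hidden
        # Scaled dot-product self-attention over the BiGRU hidden states.
        self.q = nn.Linear(d, d)
        self.k = nn.Linear(d, d)
        self.v = nn.Linear(d, d)
        # One capsule (a caps_dim-vector) per label and token; the capsule's
        # length scores that label.
        self.caps = nn.Linear(d, num_labels * caps_dim)
        self.num_labels, self.caps_dim = num_labels, caps_dim

    def forward(self, input_ids, attention_mask):
        # Character-level contextual vectors from the pre-trained BERT.
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # Sequence context features from the BiGRU.  (B, T, 2*gru_hidden)
        h, _ = self.bigru(h)
        # Self-attention re-weights the hidden states, masking padding.
        q, k, v = self.q(h), self.k(h), self.v(h)
        scores = q @ k.transpose(1, 2) / (h.size(-1) ** 0.5)
        scores = scores.masked_fill(attention_mask[:, None, :] == 0, -1e9)
        h = torch.softmax(scores, dim=-1) @ v
        # Capsule head: per-token label capsules.  (B, T, num_labels, caps_dim)
        caps = self.caps(h).view(*h.shape[:2], self.num_labels, self.caps_dim)
        caps = squash(caps)
        return caps.norm(dim=-1)  # per-token label scores in [0, 1)
```

The per-token capsule lengths could be trained with, for example, the margin loss from the original CapsNet paper (Sabour et al., 2017); the abstract does not state the authors' actual training objective or routing scheme.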
