Paper Title
Transformer-based Multi-Aspect Modeling for Multi-Aspect Multi-Sentiment Analysis
Paper Authors
Paper Abstract
Aspect-based sentiment analysis (ABSA) aims at analyzing the sentiment of a given aspect in a sentence. Recently, neural network-based methods have achieved promising results on existing ABSA datasets. However, these datasets tend to degenerate to sentence-level sentiment analysis because most sentences contain only one aspect or multiple aspects with the same sentiment polarity. To facilitate research on ABSA, NLPCC 2020 Shared Task 2 releases a new large-scale Multi-Aspect Multi-Sentiment (MAMS) dataset. In the MAMS dataset, each sentence contains at least two different aspects with different sentiment polarities, which makes ABSA more complex and challenging. To address this challenging dataset, we re-formalize ABSA as a problem of multi-aspect sentiment analysis and propose a novel Transformer-based Multi-aspect Modeling scheme (TMM), which can capture potential relations between multiple aspects and simultaneously detect the sentiment of all aspects in a sentence. Experimental results on the MAMS dataset show that our method achieves noticeable improvements over strong baselines such as BERT and RoBERTa, and it finally ranks 2nd in the NLPCC 2020 Shared Task 2 evaluation.
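To illustrate the joint multi-aspect formulation described above, the following is a minimal sketch, not the authors' released TMM code: it assumes a HuggingFace `transformers` BERT encoder, and the class name `MultiAspectSentimentModel` and the input `aspect_token_mask` are hypothetical names introduced here for demonstration. All aspects of a sentence are encoded in a single forward pass, so self-attention can relate them, and a sentiment label is predicted for every aspect at once.

```python
# Illustrative sketch only (assumptions noted above), not the paper's actual implementation.
import torch
import torch.nn as nn
from transformers import BertModel


class MultiAspectSentimentModel(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased", num_labels=3):
        super().__init__()
        self.encoder = BertModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask, aspect_token_mask):
        # input_ids, attention_mask: (batch, seq_len)
        # aspect_token_mask: (batch, num_aspects, seq_len), 1 over each aspect's tokens
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state  # (batch, seq_len, hidden)

        # Mean-pool the contextual representations of each aspect's tokens,
        # then classify every aspect from the same shared encoding of the sentence.
        mask = aspect_token_mask.unsqueeze(-1).float()          # (batch, num_aspects, seq_len, 1)
        pooled = (hidden_states.unsqueeze(1) * mask).sum(2)      # (batch, num_aspects, hidden)
        pooled = pooled / mask.sum(2).clamp(min=1.0)
        return self.classifier(pooled)                           # (batch, num_aspects, num_labels)
```

Because every aspect is scored from one shared encoding of the sentence, the interactions between aspects (e.g., contrastive sentiments toward different aspects) are available to the model, in contrast to the standard ABSA setup that encodes one sentence-aspect pair at a time.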