Paper Title

"That's so cute!": The CARE Dataset for Affective Response Detection

Authors

Jane Dwivedi-Yu, Alon Y. Halevy

Abstract

Social media plays an increasing role in our communication with friends and family, and our consumption of information and entertainment. Hence, to design effective ranking functions for posts on social media, it would be useful to predict the affective response to a post (e.g., whether the user is likely to be humored, inspired, angered, informed). Similar to work on emotion recognition (which focuses on the affect of the publisher of the post), the traditional approach to recognizing affective response would involve an expensive investment in human annotation of training data. We introduce CARE$_{db}$, a dataset of 230k social media posts annotated according to 7 affective responses using the Common Affective Response Expression (CARE) method. The CARE method is a means of leveraging the signal that is present in comments that are posted in response to a post, providing high-precision evidence about the affective response of the readers to the post without human annotation. Unlike human annotation, the annotation process we describe here can be iterated upon to expand the coverage of the method, particularly for new affective responses. We present experiments that demonstrate that the CARE annotations compare favorably with crowd-sourced annotations. Finally, we use CARE$_{db}$ to train competitive BERT-based models for predicting affective response as well as emotion detection, demonstrating the utility of the dataset for related tasks.
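The core idea of the CARE method, using common affective response expressions found in comments as weak labels for the post they respond to, can be sketched roughly as follows. The pattern lexicon, label names, and vote threshold below are illustrative assumptions, not the paper's actual lexicon or parameters:

```python
import re

# Hypothetical CARE-style lexicon: each affective response is keyed to
# regular expressions matching common reader expressions in comments.
# These patterns are illustrative only, not the paper's actual lexicon.
CARE_PATTERNS = {
    "amused": [r"\blol\b", r"\bhaha+\b", r"\bso funny\b"],
    "adoring": [r"\bso cute\b", r"\badorable\b"],
    "angered": [r"\binfuriating\b", r"\bso angry\b"],
}

def label_post(comments, min_votes=2):
    """Weakly label a post with affective responses: a label fires only
    when at least `min_votes` comments match one of its patterns,
    trading coverage for precision (no human annotation needed)."""
    votes = {label: 0 for label in CARE_PATTERNS}
    for comment in comments:
        text = comment.lower()
        for label, patterns in CARE_PATTERNS.items():
            if any(re.search(p, text) for p in patterns):
                votes[label] += 1
    return sorted(label for label, count in votes.items()
                  if count >= min_votes)
```

For example, a post whose comments include "That's so cute!" and "Adorable puppy" would be labeled `adoring`, while a single "haha" comment would not clear the vote threshold. The resulting weakly labeled posts could then serve as training data for a supervised classifier such as the BERT-based models the abstract describes.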
