Paper Title
Promoting and countering misinformation during Australia's 2019-2020 bushfires: A case study of polarisation
Paper Authors
Paper Abstract
During Australia's unprecedented bushfires in 2019-2020, misinformation blaming arson resurfaced on Twitter using #ArsonEmergency. The extent to which bots were responsible for disseminating and amplifying this misinformation has received scrutiny in the media and in academic research. Here we study the Twitter communities spreading this misinformation through their population-level activity, and investigate the role of online communities and bots. Our in-depth investigation of the discussion dynamics uses a phased approach, comparing activity before and after mainstream media reported that bots were promoting the hashtag. Although we did not find many bots, the most bot-like accounts were social bots, which present as genuine humans. Further, we distilled meaningful quantitative differences between the two polarised communities in the Twitter discussion, yielding the following insights. First, Supporters of the arson narrative promoted misinformation by engaging others directly with replies and mentions, using hashtags and links to external sources. In response, Opposers retweeted fact-based articles and official information. Second, Supporters were embedded throughout the interaction networks, whereas Opposers attained high centrality more efficiently despite their peripheral positions. By the last phase, Opposers and unaffiliated accounts appeared to coordinate, potentially reaching a broader audience. Finally, unaffiliated accounts shared the same URLs as Opposers over Supporters by a ratio of 9:1 in the last phase, having shared mostly Supporter URLs in the first phase. This foiled the Supporters' efforts and highlights the value of exposing misinformation campaigns. We speculate that the communication strategies observed here could be discoverable in other misinformation-related discussions and could inform counter-strategies.
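
To make the network analysis in the abstract concrete, below is a minimal sketch in Python (using networkx) of the kind of measurements described: building a directed interaction network from replies, mentions, and retweets, comparing the average centrality of the two polarised communities, and computing the ratio of Opposer versus Supporter URLs shared by unaffiliated accounts. All account names, records, labels, and URLs here are hypothetical illustrations, not the paper's data or actual pipeline.

import networkx as nx
from collections import Counter

# Hypothetical interaction records: (source, target, interaction type).
# In the study these would be extracted from #ArsonEmergency tweets.
interactions = [
    ("sup_1", "user_a", "reply"),
    ("sup_1", "user_b", "mention"),
    ("sup_2", "user_a", "reply"),
    ("opp_1", "news_org", "retweet"),
    ("opp_2", "opp_1", "retweet"),
    ("user_a", "opp_1", "retweet"),
    ("user_b", "opp_1", "retweet"),
]

# Hypothetical community labels from a prior polarisation analysis.
community = {"sup_1": "Supporter", "sup_2": "Supporter",
             "opp_1": "Opposer", "opp_2": "Opposer"}

# Build the directed interaction network.
G = nx.DiGraph()
for src, dst, kind in interactions:
    G.add_edge(src, dst, kind=kind)

# Betweenness centrality as one proxy for how central each community sits.
bc = nx.betweenness_centrality(G)
for label in ("Supporter", "Opposer"):
    members = [n for n in G if community.get(n) == label]
    mean_bc = sum(bc[n] for n in members) / max(len(members), 1)
    print(f"{label}: mean betweenness = {mean_bc:.3f}")

# URL-sharing comparison: how often unaffiliated accounts share URLs
# previously shared by each community (hypothetical URL sets).
supporter_urls = {"http://example.com/arson-claim"}
opposer_urls = {"http://example.com/fact-check", "http://example.com/official"}
unaffiliated_shares = (["http://example.com/fact-check"] * 9
                       + ["http://example.com/arson-claim"])

counts = Counter("Opposer" if u in opposer_urls
                 else "Supporter" if u in supporter_urls
                 else "other"
                 for u in unaffiliated_shares)
print(f"Opposer:Supporter URL ratio = {counts['Opposer']}:{counts['Supporter']}")

Any real analysis would of course operate on the full tweet corpus, and the paper does not specify this tooling; the sketch only illustrates the style of centrality and URL-overlap measures that underpin the second and third findings.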