Paper Title
TAN-NTM: Topic Attention Networks for Neural Topic Modeling
Paper Authors
Paper Abstract
Topic models have been widely used to learn text representations and gain insight into document corpora. To perform topic discovery, most existing neural models take either a document's bag-of-words (BoW) or its sequence of tokens as input, followed by variational inference and BoW reconstruction to learn the topic-word distribution. However, leveraging the topic-word distribution to learn better features during document encoding has not been explored much. To this end, we develop a framework, TAN-NTM, which processes a document as a sequence of tokens through an LSTM whose contextual outputs are attended to in a topic-aware manner. We propose a novel attention mechanism which factors in the topic-word distribution to enable the model to attend to relevant words that convey topic-related cues. The output of the topic attention module is then used to carry out variational inference. We perform extensive ablations and experiments, resulting in a ~9-15% improvement over the NPMI coherence scores of existing SOTA topic models on several benchmark datasets: 20Newsgroups, Yelp Review Polarity and AGNews. Further, we show that our method learns better latent document-topic features than existing topic models, as demonstrated by improvements on two downstream tasks: document classification and topic-guided keyphrase generation.
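To make the described pipeline concrete, below is a minimal PyTorch sketch of the idea the abstract outlines: LSTM-encode the token sequence, attend over the hidden states with queries derived from the topic-word distribution, then run VAE-style inference and BoW reconstruction on the attended representation. This is an illustrative approximation, not the authors' released implementation; all names here (TopicAttentionNTM, beta, topic_proj, etc.) are assumptions, and the exact attention formulation in the paper may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAttentionNTM(nn.Module):
    """Hypothetical sketch of a topic-attention neural topic model."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_topics):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Topic-word distribution parameters: one row of logits per topic.
        self.beta = nn.Parameter(torch.randn(num_topics, vocab_size) * 0.02)
        # Project topic-word rows into the attention query space.
        self.topic_proj = nn.Linear(vocab_size, hidden_dim)
        # VAE heads producing the document-topic posterior.
        self.mu_head = nn.Linear(hidden_dim, num_topics)
        self.logvar_head = nn.Linear(hidden_dim, num_topics)
        self.decoder = nn.Linear(num_topics, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> contextual LSTM outputs (B, T, H).
        h, _ = self.lstm(self.embed(token_ids))
        # One attention query per topic, derived from the topic-word
        # distribution so the attention "factors in" topic-word structure.
        topic_q = self.topic_proj(F.softmax(self.beta, dim=-1))   # (K, H)
        scores = h @ topic_q.t()                                  # (B, T, K)
        # Pool over topics (max here, one of several plausible choices),
        # then normalize over tokens to get topic-aware word weights.
        attn = F.softmax(scores.max(dim=-1).values, dim=-1)       # (B, T)
        context = (attn.unsqueeze(-1) * h).sum(dim=1)             # (B, H)
        # Variational inference on the attended document representation.
        mu, logvar = self.mu_head(context), self.logvar_head(context)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        theta = F.softmax(z, dim=-1)                # document-topic mixture
        bow_recon = F.log_softmax(self.decoder(theta), dim=-1)
        return bow_recon, mu, logvar
```

Training would pair the BoW reconstruction term with the usual KL divergence against the prior, as in standard VAE-based neural topic models; the max-over-topics pooling above is only one way to aggregate per-topic attention scores into a single word weighting.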