Title
Attention-Based Clustering: Learning a Kernel from Context
Authors
Abstract
In machine learning, no data point stands alone. We believe that context is an underappreciated concept in many machine learning methods. We propose Attention-Based Clustering (ABC), a neural architecture based on the attention mechanism, which is designed to learn latent representations that adapt to context within an input set, and which is inherently agnostic to the input size and the number of clusters. By learning a similarity kernel, our method directly combines with any out-of-the-box kernel-based clustering approach. We present competitive results for clustering Omniglot characters and include analytical evidence of the effectiveness of an attention-based approach for clustering.
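The abstract's pipeline — learn a pairwise similarity kernel from attention scores, then hand it to any kernel-based clustering method — can be sketched as follows. This is a minimal illustration, not the paper's architecture: the data, the identity projections `W_q`/`W_k` (which ABC would learn), and the two-way spectral split are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_kernel(X, W_q, W_k):
    """Scaled dot-product attention scores, symmetrized into a similarity kernel."""
    Q, K = X @ W_q, X @ W_k
    d = Q.shape[1]
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax: each point's similarity profile depends on the whole
    # input set, i.e. on its context.
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)
    # Symmetrize so the result can serve as a precomputed affinity matrix.
    return 0.5 * (A + A.T)

def spectral_two_way(Kmat):
    """Two-cluster spectral split: sign of the Fiedler vector of the
    symmetric normalized graph Laplacian."""
    deg = Kmat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(len(Kmat)) - D_inv_sqrt @ Kmat @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    return (vecs[:, 1] > 0).astype(int)  # second-smallest eigenvector

# Hypothetical toy data: two well-separated groups in opposite directions,
# so even untrained projections yield a near block-diagonal kernel.
X = np.vstack([
    rng.normal([5.0, 0.0], 0.5, size=(20, 2)),
    rng.normal([-5.0, 0.0], 0.5, size=(20, 2)),
])
W_q = W_k = np.eye(2)  # placeholder projections; ABC learns these from data
K = attention_kernel(X, W_q, W_k)
labels = spectral_two_way(K)
```

The key point the sketch mirrors is the abstract's modularity claim: once the attention module produces a kernel matrix, any off-the-shelf kernel- or affinity-based clustering method (here a bare-bones spectral cut) can consume it unchanged.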