Paper Title
Network Embedding with Completely-imbalanced Labels
Paper Authors
Paper Abstract
Network embedding, which aims to project a network into a low-dimensional space, is increasingly becoming a focus of network research. Semi-supervised network embedding takes advantage of labeled data and has shown promising performance. However, existing semi-supervised methods give unappealing results in the completely-imbalanced label setting, where some classes have no labeled nodes at all. To alleviate this, we propose two novel semi-supervised network embedding methods. The first is a shallow method named RSDNE. Specifically, to benefit from the completely-imbalanced labels, RSDNE guarantees both intra-class similarity and inter-class dissimilarity in an approximate way. The other method is RECT, a new class of graph neural networks. Unlike RSDNE, RECT benefits from the completely-imbalanced labels by exploring class-semantic knowledge, which enables it to handle networks with node features and multi-label settings. Experimental results on several real-world datasets demonstrate the superiority of the proposed methods. Code is available at https://github.com/zhengwang100/RECT.
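To make the "completely-imbalanced label setting" concrete, the following minimal Python sketch (not from the paper; the function name, the seen/unseen class split, and the train ratio are illustrative assumptions) shows how such a split can be simulated: labeled training nodes are drawn only from a subset of "seen" classes, while the remaining "unseen" classes contribute no labeled nodes at all.

```python
import numpy as np

def completely_imbalanced_split(labels, seen_classes, train_ratio=0.1, seed=0):
    """Simulate the completely-imbalanced setting: labeled nodes are sampled
    only from `seen_classes`; every other class gets zero labeled nodes."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    train_idx = []
    for c in seen_classes:
        nodes_c = np.where(labels == c)[0]              # all nodes of class c
        n_train = max(1, int(train_ratio * len(nodes_c)))
        train_idx.extend(rng.choice(nodes_c, n_train, replace=False))
    train_idx = np.sort(np.array(train_idx))
    # Everything else is unlabeled; nodes of unseen classes appear only here,
    # so those classes have no labeled examples at all.
    test_idx = np.setdiff1d(np.arange(len(labels)), train_idx)
    return train_idx, test_idx

# Toy usage: three classes, classes {0, 1} are seen, class 2 is never labeled.
labels = [0, 0, 1, 1, 2, 2, 0, 1, 2, 2]
train_idx, test_idx = completely_imbalanced_split(labels, seen_classes=[0, 1],
                                                  train_ratio=0.5)
print(train_idx, test_idx)
```

A conventional semi-supervised split would sample labeled nodes from every class; the sketch above instead restricts sampling to the seen classes, which is the scenario the proposed RSDNE and RECT methods are designed to handle.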