Paper Title

Gransformer: Transformer-based Graph Generation

Paper Authors

Ahmad Khajenezhad, Seyed Ali Osia, Mahmood Karimian, Hamid Beigy

Paper Abstract

Transformers have become widely used in various tasks, such as natural language processing and machine vision. This paper proposes Gransformer, an algorithm based on the Transformer for generating graphs. We modify the Transformer encoder to exploit the structural information of the given graph. The attention mechanism is adapted to consider the presence or absence of an edge between each pair of nodes. We also introduce a graph-based familiarity measure between node pairs that applies to both the attention mechanism and the positional encoding. This familiarity measure is based on message-passing algorithms and contains structural information about the graph. Moreover, the measure is autoregressive, which lets our model acquire the necessary conditional probabilities in a single forward pass. In the output layer, we use a masked autoencoder for density estimation to efficiently model the sequential generation of dependent edges connected to each node. In addition, we propose a technique based on BFS node orderings to prevent the model from generating isolated nodes with no connection to preceding nodes. We evaluate this method on synthetic and real-world datasets and compare it with related approaches, including recurrent models and graph convolutional networks. Experimental results show that the proposed method performs comparably to these methods.
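To make the two central ideas of the abstract concrete, here is a minimal, illustrative sketch (not the authors' code) of edge-aware attention with a message-passing familiarity bias. The function names, the walk-count form of the familiarity measure, and the scalar bias weights are all assumptions for illustration; the paper's actual formulation may differ.

```python
# Illustrative sketch only: attention logits shifted by (1) edge presence /
# absence and (2) a message-passing "familiarity" measure, with a
# lower-triangular mask so the model stays autoregressive. Names and the
# exact bias form are hypothetical, not taken from the paper.
import numpy as np

def familiarity(adj, hops=3):
    """Walk-count familiarity: sums powers of the adjacency matrix so that
    F[i, j] grows with the number of short paths between nodes i and j.
    (Simplified: the paper's measure is itself autoregressive.)"""
    F = np.zeros_like(adj, dtype=float)
    P = np.eye(len(adj))
    for _ in range(hops):
        P = P @ adj
        F += P
    return F / (F.max() + 1e-9)  # normalize to keep logits in range

def edge_aware_attention(X, adj, w_edge=1.0, w_noedge=-1.0, w_fam=1.0):
    """Self-attention whose logits are biased by edge presence/absence and
    by the familiarity measure; the lower-triangular mask means node i
    attends only to preceding nodes, enabling a single forward pass."""
    n, d = X.shape
    logits = (X @ X.T) / np.sqrt(d)
    logits += np.where(adj > 0, w_edge, w_noedge)  # edge / no-edge bias
    logits += w_fam * familiarity(adj)             # structural familiarity
    logits[np.triu_indices(n, k=1)] = -np.inf      # autoregressive mask
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

# Tiny usage example on a 4-node path graph
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 8))
print(edge_aware_attention(X, adj).shape)  # (4, 8)
```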
