Paper Title

Fast Graph Convolutional Recurrent Neural Networks

Paper Authors

Sai Kiran Kadambari, Sundeep Prabhakar Chepuri

Paper Abstract

This paper proposes a Fast Graph Convolutional Recurrent Neural Network (FGRNN) architecture to predict sequences with an underlying graph structure. The proposed architecture addresses the limitations of the standard recurrent neural network (RNN), namely vanishing and exploding gradients, which cause numerical instabilities during training. State-of-the-art architectures that combine gated RNN variants, such as the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), with graph convolutions are known to improve numerical stability during the training phase, but at the expense of model size, involving a large number of training parameters. FGRNN addresses this problem by adding a weighted residual connection with only two extra training parameters compared to the standard RNN. Numerical experiments on a real 3D point cloud dataset corroborate the proposed architecture.
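The weighted-residual idea described in the abstract can be illustrated with a minimal sketch. The cell below is not the authors' code: it assumes a first-order graph convolution of the form A_hat @ X @ W (with A_hat a normalized adjacency matrix) inside a vanilla RNN update, plus two extra trainable scalars (here called alpha and beta) that weight the residual connection; the class name FGRNNCell, the parameter shapes, and the initialization are hypothetical choices made for the example.

```python
import torch
import torch.nn as nn

class FGRNNCell(nn.Module):
    """Sketch of a fast graph-convolutional recurrent cell (hypothetical).

    A vanilla RNN update whose input and state transforms are first-order
    graph convolutions (A_hat @ X @ W), plus a weighted residual connection
    controlled by two extra scalar parameters (alpha, beta) -- the only
    additions over a standard RNN, as the abstract describes.
    """

    def __init__(self, in_feats: int, hidden_feats: int):
        super().__init__()
        self.W_in = nn.Parameter(torch.randn(in_feats, hidden_feats) * 0.1)
        self.W_h = nn.Parameter(torch.randn(hidden_feats, hidden_feats) * 0.1)
        self.bias = nn.Parameter(torch.zeros(hidden_feats))
        # The two extra trainable scalars weighting the residual connection.
        self.alpha = nn.Parameter(torch.tensor(0.1))
        self.beta = nn.Parameter(torch.tensor(0.9))

    def forward(self, A_hat: torch.Tensor, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # A_hat: (N, N) normalized adjacency, x: (N, in_feats), h: (N, hidden_feats)
        h_tilde = torch.tanh(A_hat @ x @ self.W_in + A_hat @ h @ self.W_h + self.bias)
        # Weighted residual connection intended to stabilize gradients over long sequences.
        return self.alpha * h_tilde + self.beta * h


# Toy usage on a random 5-node graph sequence.
N, F_in, F_h, T = 5, 3, 8, 10
A_hat = torch.eye(N)                      # placeholder normalized adjacency
cell = FGRNNCell(F_in, F_h)
h = torch.zeros(N, F_h)
for t in range(T):
    h = cell(A_hat, torch.randn(N, F_in), h)
print(h.shape)  # torch.Size([5, 8])
```

Compared with a gated cell such as a graph-convolutional GRU, which adds several full weight matrices per gate, this sketch adds only the two scalars alpha and beta on top of the standard recurrent update, which is the parameter saving the abstract refers to.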
