Paper Title
Path Development Network with Finite-dimensional Lie Group Representation
Paper Authors
Paper Abstract
The signature, lying at the heart of rough path theory, is a central tool for analysing controlled differential equations driven by irregular paths. Recently it has also found extensive applications in machine learning and data science as a mathematically principled, universal feature that boosts the performance of deep learning-based models in sequential data tasks. It nevertheless suffers from the curse of dimensionality when paths are high-dimensional. We propose a novel, trainable path development layer, which exploits representations of sequential data through finite-dimensional Lie groups, thus resulting in dimension reduction. Its backpropagation algorithm is designed via optimization on manifolds. Our proposed layer, analogous to recurrent neural networks (RNNs), possesses an explicit, simple recurrent unit that alleviates gradient issues. Our layer demonstrates its strength in irregular time series modelling. Empirical results on a range of datasets show that the development layer consistently and significantly outperforms signature features in accuracy and dimensionality. A compact hybrid model (stacking a one-layer LSTM with the development layer) achieves state-of-the-art performance against various RNN and continuous-time series models. Our layer also enhances the performance of modelling dynamics constrained to Lie groups. Code is available at https://github.com/PDevNet/DevNet.git.
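The recurrent unit of such a development layer can be illustrated with a minimal sketch (not the authors' implementation). Assuming the target group is SO(m), each path increment is mapped by trainable weights into the Lie algebra so(m) of skew-symmetric matrices, and the hidden state is updated multiplicatively through the matrix exponential, so it always stays on the group; the function and variable names below are hypothetical:

```python
import numpy as np
from scipy.linalg import expm

def skew(A):
    """Project a square matrix onto the Lie algebra so(m) (skew-symmetric part)."""
    return 0.5 * (A - A.T)

def development(path, weights):
    """Develop a d-dimensional discretised path into the Lie group SO(m).

    path:    array of shape (T, d), the input time series
    weights: array of shape (d, m, m), one trainable matrix per input channel
    """
    d, m = weights.shape[0], weights.shape[1]
    z = np.eye(m)                       # start at the group identity
    increments = np.diff(path, axis=0)  # path increments Δx_k, shape (T-1, d)
    for dx in increments:
        # Linear map into the Lie algebra: A = Σ_i Δx_i · skew(W_i)
        A = sum(dx[i] * skew(weights[i]) for i in range(d))
        z = z @ expm(A)                 # explicit multiplicative recurrent update
    return z

rng = np.random.default_rng(0)
path = rng.standard_normal((20, 3))       # toy path: 20 steps, 3 channels
weights = rng.standard_normal((3, 4, 4))  # develop into SO(4)
z = development(path, weights)
print(np.allclose(z @ z.T, np.eye(4)))    # output stays orthogonal → True
```

Because each update multiplies by the exponential of a Lie algebra element, the output is an m×m group element regardless of the path's length or dimension, which is the source of the dimension reduction relative to the (tensor-valued) signature.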