Title
Task-Synchronized Recurrent Neural Networks
Authors
Abstract
Data are often sampled irregularly in time. Dealing with this using Recurrent Neural Networks (RNNs) has traditionally involved ignoring the irregularity, feeding the time differences as additional inputs, or resampling the data. All of these methods have their shortcomings. We propose an elegant, straightforward alternative in which the RNN itself is in effect resampled in time to match the timing of the data or the task at hand. We use the Echo State Network (ESN) and the Gated Recurrent Unit (GRU) as the basis for our solution. Such RNNs can be seen as discretizations of continuous-time dynamical systems, which gives our approach a solid theoretical grounding. Our Task-Synchronized ESN (TSESN) and GRU (TSGRU) models allow the model time to be set directly and require no additional training, parameter tuning, or computation (such as solving differential equations or interpolating data) compared to their regular counterparts, thus retaining their original efficiency. We confirm empirically that our models can effectively compensate for the time-non-uniformity of the data, and we demonstrate that they compare favorably to data resampling, classical RNN methods, and alternative RNN models proposed to deal with time irregularities, on several real-world nonuniform-time datasets. We open-source the code at https://github.com/oshapio/task-synchronized-RNNs.
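The core idea, as described in the abstract, can be illustrated with a leaky-integrator ESN whose leak term is scaled by the time gap between consecutive samples, so that the discrete update tracks the underlying continuous-time dynamics at the data's own timestamps. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' reference implementation; the class name, hyperparameters, and the exact way the leak is scaled by `dt` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class TimeScaledESN:
    """Hypothetical leaky ESN whose leak term is scaled by the per-step
    time gap dt, illustrating the task-synchronization idea."""

    def __init__(self, n_in, n_res, leak=1.0, spectral_radius=0.9):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale the recurrent weights to the desired spectral radius.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W
        self.leak = leak
        self.x = np.zeros(n_res)

    def step(self, u, dt):
        # Effective leak a = leak * dt interpolates between holding the
        # previous state (dt -> 0) and a full conventional update
        # (dt = 1), clipped to stay a valid convex-combination weight.
        a = min(self.leak * dt, 1.0)
        x_new = np.tanh(self.W_in @ u + self.W @ self.x)
        self.x = (1.0 - a) * self.x + a * x_new
        return self.x

# Drive the reservoir with an irregularly sampled signal.
esn = TimeScaledESN(n_in=1, n_res=50)
times = np.cumsum(rng.exponential(1.0, size=20))  # irregular timestamps
dts = np.diff(times, prepend=0.0)
states = [esn.step(np.array([np.sin(t)]), dt) for t, dt in zip(times, dts)]
```

A readout (e.g. ridge regression on the collected `states`) would then be trained exactly as for a regular ESN; the only change relative to the conventional model is the `dt`-scaled state update.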