Paper Title
Wavelet Networks: Scale-Translation Equivariant Learning From Raw Time-Series
Paper Authors
Paper Abstract
Leveraging the symmetries inherent to specific data domains for the construction of equivariant neural networks has led to remarkable improvements in terms of data efficiency and generalization. However, most existing research focuses on symmetries arising from planar and volumetric data, leaving a crucial data source largely underexplored: time-series. In this work, we fill this gap by leveraging the symmetries inherent to time-series for the construction of equivariant neural networks. We identify two core symmetries: *scale and translation*, and construct scale-translation equivariant neural networks for time-series learning. Intriguingly, we find that scale-translation equivariant mappings bear a strong resemblance to the wavelet transform. Inspired by this resemblance, we term our networks Wavelet Networks, and show that they perform nested non-linear wavelet-like time-frequency transforms. Empirical results show that Wavelet Networks outperform conventional CNNs on raw waveforms, and match strongly engineered spectrogram techniques across several tasks and time-series types, including audio, environmental sounds, and electrical signals. Our code is publicly available at https://github.com/dwromero/wavelet_networks.
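To give a concrete intuition for the "wavelet-like" structure the abstract describes, the sketch below implements a scale-translation lifting for a 1-D signal: a single base filter is dilated over a set of scales and correlated with the signal, producing a feature map indexed by (scale, translation). This is a minimal, hypothetical illustration in NumPy under our own assumptions (dyadic scales, L1 normalization, resampling by interpolation), not the authors' implementation from the linked repository.

```python
import numpy as np

def lift_scale_translation(x, psi, scales=(1, 2, 4)):
    """Lift a 1-D signal onto the scale-translation group by convolving it
    with rescaled copies of one base filter psi (wavelet-transform-like).
    Hypothetical sketch, not the paper's actual layer."""
    n = len(psi)
    responses = []
    for s in scales:
        # Dilate psi by factor s via resampling, and L1-normalize (1/s factor)
        grid = np.linspace(0.0, 1.0, int(round(s * n)))
        psi_s = np.interp(grid, np.linspace(0.0, 1.0, n), psi) / s
        # Translation axis comes for free from the convolution
        responses.append(np.convolve(x, psi_s, mode="same"))
    return np.stack(responses)  # shape: (num_scales, len(x))

# Usage: a random signal and a Mexican-hat-like base filter
x = np.random.randn(128)
tau = np.linspace(-3.0, 3.0, 16)
psi = (1.0 - tau**2) * np.exp(-tau**2 / 2.0)
features = lift_scale_translation(x, psi)
print(features.shape)  # (3, 128)
```

Stacking several such group convolutions with nonlinearities in between is what yields the "nested non-linear wavelet-like time-frequency transforms" mentioned above.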