Paper Title
Omni-Scale CNNs: a simple and effective kernel size configuration for time series classification
Paper Authors
Paper Abstract
The Receptive Field (RF) size has been one of the most important factors for One Dimensional Convolutional Neural Networks (1D-CNNs) on time series classification tasks. Considerable effort has gone into choosing an appropriate size, because it has a large influence on performance and differs significantly across datasets. In this paper, we propose an Omni-Scale block (OS-block) for 1D-CNNs, in which the kernel sizes are decided by a simple and universal rule. Specifically, it is a set of kernel sizes, composed of multiple prime numbers according to the length of the time series, that can efficiently cover the best RF size across different datasets. Experimental results show that models with the OS-block achieve performance similar to that of models with the searched optimal RF size, and thanks to this strong ability to capture the optimal RF size, simple 1D-CNN models with the OS-block achieve state-of-the-art performance on four time series benchmarks, including both univariate and multivariate data from multiple domains. Comprehensive analysis and discussion shed light on why the OS-block can capture the optimal RF size across different datasets. Code is available at https://github.com/Wensi-Tang/OS-CNN
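The exact kernel-size rule is defined in the linked repository; as a minimal, hypothetical sketch of the idea described in the abstract (the function names, the bound of half the series length, and the inclusion of sizes 1 and 2 are illustrative assumptions, not the paper's precise configuration), deriving a prime-number kernel set from the time-series length could look like this:

```python
# Illustrative sketch only: build a kernel-size set from prime numbers
# bounded by the time-series length, so that combinations of kernels
# across layers can realize many receptive-field sizes.

def primes_up_to(n):
    """Return all primes <= n via a simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i:: i] = [False] * len(sieve[i * i:: i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

def os_block_kernel_sizes(series_length):
    """Hypothetical kernel-size configuration for an OS-block:
    sizes 1 and 2 plus all odd primes up to roughly half the series length
    (assumed bound; see the official repository for the actual rule)."""
    bound = max(2, series_length // 2)
    return [1, 2] + [p for p in primes_up_to(bound) if p > 2]

print(os_block_kernel_sizes(140))  # e.g. for a length-140 series
```

The motivation for primes is that sums of a small set of prime kernel sizes can cover a dense range of receptive-field sizes, which is why a single fixed rule can serve datasets with very different optimal RF sizes.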