Paper Title
Data-Folding and Hyperspace Coding for Multi-Dimensional Time-Series Data Imaging
Paper Authors
Paper Abstract
Multi-dimensional time-series classification and prediction are widely used in many fields, such as disease prevention, fault diagnosis, and action recognition. However, traditional methods require manual intervention and inference and cannot achieve a figurative expression of multi-dimensional data, which leads to inadequate information mining. Inspired by the strong capability of deep learning in image processing, we propose a unified time-series image fusion framework that transforms multi-modal data into 2D images and then performs automatic feature extraction and classification with a lightweight convolutional neural network. We present two basic image coding methods, Gray image coding and RGB image coding, together with their step-coding variants. Considering the universality of different application fields, we extend the coding methods and propose two types of transform coding, Transform-RGB coding and RGB-Transform coding, to improve multi-domain representation ability. Applied to three typical scenarios of Parkinson's disease diagnosis, bearing fault detection, and gymnastic action recognition, the framework achieves the highest classification accuracies of 100%, 92.86%, and 99.70% respectively, all higher than those of classical processing methods. This demonstrates the strong classification ability and generality of our coding framework across different multi-dimensional scenarios. We expect this method to perform well in other scenarios as well and to potentially facilitate the progress of related technologies.
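The abstract does not spell out how the basic codings are implemented. The following is a minimal sketch of one plausible reading of Gray and RGB image coding, assuming min-max normalization to pixel intensities and row-major reshaping into a square image; the function names, the `side` parameter, and the normalization scheme are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def gray_image_coding(series: np.ndarray, side: int) -> np.ndarray:
    """Sketch of Gray image coding: map a 1-D time series to a 2-D
    grayscale image by min-max normalizing values to [0, 255] and
    reshaping row by row into a (side, side) pixel grid."""
    x = series[: side * side].astype(np.float64)
    x = (x - x.min()) / (x.max() - x.min() + 1e-12)  # normalize to [0, 1]
    return (x * 255).astype(np.uint8).reshape(side, side)

def rgb_image_coding(series_3ch: np.ndarray, side: int) -> np.ndarray:
    """Sketch of RGB image coding: map three synchronized channels of a
    multi-dimensional series to the R, G, B planes of a single image."""
    planes = [gray_image_coding(ch, side) for ch in series_3ch[:3]]
    return np.stack(planes, axis=-1)  # shape: (side, side, 3)

# Example: a 3-channel signal of 4096 samples -> one 64x64 RGB image,
# ready to feed into a lightweight CNN classifier.
signal = np.random.randn(3, 4096)
img = rgb_image_coding(signal, side=64)
print(img.shape, img.dtype)  # (64, 64, 3) uint8
```

Under this reading, the step-coding and transform-coding variants would change how samples are selected (e.g., strided sampling) or what domain the values come from (e.g., a frequency-domain transform applied before or after the RGB mapping), but those details are not given in the abstract.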