Paper Title

StyleTime: Style Transfer for Synthetic Time Series Generation

Authors

Yousef El-Laham, Svitlana Vyetrenko

Abstract

Neural style transfer is a powerful computer vision technique that can incorporate the artistic "style" of one image into the "content" of another. The underlying theory behind the approach relies on the assumption that the style of an image is represented by the Gram matrix of its features, which are typically extracted from a pre-trained convolutional neural network (e.g., VGG-19). This idea does not straightforwardly extend to time series stylization, since notions of style for two-dimensional images are not analogous to notions of style for one-dimensional time series. In this work, a novel formulation of time series style transfer is proposed for the purpose of synthetic data generation and enhancement. We introduce the concept of stylized features for time series, which are directly related to time series realism properties, and propose a novel stylization algorithm, called StyleTime, that uses explicit feature extraction techniques to combine the underlying content (trend) of one time series with the style (distributional properties) of another. Further, we discuss evaluation metrics and compare our work to existing state-of-the-art time series generation and augmentation schemes. To validate the effectiveness of our methods, we use stylized synthetic data as a means for data augmentation to improve the performance of recurrent neural network models on several forecasting tasks.
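As background for the Gram-matrix assumption the abstract refers to, here is a minimal sketch (not from the paper) of how the Gram matrix of a feature map is computed in image style transfer. The feature array `F` below is a random stand-in for CNN activations; shapes and names are illustrative assumptions.

```python
import numpy as np

def gram_matrix(features):
    """Style representation used in neural style transfer.

    features: array of shape (C, N) -- C channels, N spatial positions
    (a CNN feature map flattened over its spatial dimensions).
    Returns the (C, C) matrix G = F F^T / N of channel correlations,
    which discards spatial layout and keeps only feature co-occurrence.
    """
    C, N = features.shape
    return features @ features.T / N

rng = np.random.default_rng(0)
F = rng.standard_normal((4, 100))  # stand-in for VGG-style activations
G = gram_matrix(F)
print(G.shape)  # (4, 4)
```

Because `G` drops all positional information, it captures texture-like "style" for images; the paper's point is that no analogous pre-trained feature extractor exists for 1-D time series, motivating its explicit stylized features instead.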
