Paper Title
CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting
Paper Authors
Paper Abstract
Deep learning has been actively studied for time series forecasting, and the mainstream paradigm is based on end-to-end training of neural network architectures, ranging from classical LSTMs/RNNs to more recent TCNs and Transformers. Motivated by the recent success of representation learning in computer vision and natural language processing, we argue that a more promising paradigm for time series forecasting is to first learn disentangled feature representations, followed by a simple regression fine-tuning step -- we justify such a paradigm from a causal perspective. Following this principle, we propose CoST, a new representation learning framework for time series forecasting, which applies contrastive learning methods to learn disentangled seasonal-trend representations. CoST comprises both time-domain and frequency-domain contrastive losses to learn discriminative trend and seasonal representations, respectively. Extensive experiments on real-world datasets show that CoST consistently outperforms state-of-the-art methods by a considerable margin, achieving a 21.3% improvement in MSE on multivariate benchmarks. It is also robust to various choices of backbone encoders, as well as downstream regressors. Code is available at https://github.com/salesforce/CoST.
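The two-stage paradigm the abstract argues for (a frozen feature encoder followed by a simple regressor) can be sketched as below. This is a minimal illustration, not the paper's method: the `encode` function here uses hand-crafted trend and seasonal (Fourier) summary features as a stand-in for CoST's contrastively pretrained encoder, and the toy series, window length, and ridge regressor are all assumptions for the example.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy series: linear trend + daily seasonality + noise (stand-in for a real dataset).
t = np.arange(400, dtype=float)
series = 0.05 * t + np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)

def encode(window: np.ndarray) -> np.ndarray:
    """Hypothetical frozen encoder. In CoST this would be a pretrained
    contrastive model producing trend and seasonal representations; here we
    use simple summaries: local linear-trend coefficients plus low-frequency
    Fourier coefficients (real and imaginary parts, so phase is kept)."""
    k = np.arange(window.size)
    trend = np.polyfit(k, window, 1)       # [slope, intercept] of the window
    spec = np.fft.rfft(window)[:5]         # lowest 5 frequency components
    return np.concatenate([trend, spec.real, spec.imag])

# Stage 1 output: (representation, next-value) pairs via a sliding window.
L = 48  # lookback length (arbitrary choice for this sketch)
X = np.stack([encode(series[i:i + L]) for i in range(series.size - L)])
y = series[L:]

# Stage 2: a simple regressor fine-tuned on top of the frozen representations.
reg = Ridge(alpha=1.0).fit(X[:-50], y[:-50])
mse = np.mean((reg.predict(X[-50:]) - y[-50:]) ** 2)
print(f"held-out MSE: {mse:.4f}")
```

Because the representations already expose trend and seasonal structure, even a plain ridge regression forecasts the held-out tail far better than predicting the series mean, which is the abstract's point: the heavy lifting moves into representation learning, leaving only a lightweight regression step.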