Paper Title
Mixture of Online and Offline Experts for Non-stationary Time Series
Paper Authors
Paper Abstract
We consider a general and realistic scenario involving non-stationary time series, consisting of several offline intervals with different distributions within a fixed offline time horizon, and an online interval that continuously receives new samples. For non-stationary time series, the data distribution in the current online interval may have appeared in previous offline intervals. We theoretically explore the feasibility of applying knowledge from offline intervals to the current online interval. To this end, we propose the Mixture of Online and Offline Experts (MOOE). MOOE learns static offline experts from offline intervals and maintains a dynamic online expert for the current online interval. It then adaptively combines the offline and online experts using a meta expert to make predictions for the samples received in the online interval. Specifically, we focus on theoretical analysis, deriving parameter convergence, regret bounds, and generalization error bounds to prove the effectiveness of the algorithm.
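The abstract describes a meta expert that adaptively combines frozen offline experts with one adaptive online expert. The paper's exact update rule is not given here, so the following is a minimal sketch under an assumed exponentially weighted (Hedge-style) combination; the class name `MetaExpert`, the learning rate `eta`, and the toy experts are all illustrative, not the authors' implementation.

```python
import numpy as np

class MetaExpert:
    """Hypothetical MOOE-style meta expert: weights a pool of experts
    and reweights them multiplicatively by their observed losses."""

    def __init__(self, experts, eta=0.5):
        self.experts = experts          # callables: x -> prediction
        self.eta = eta                  # assumed learning rate
        self.weights = np.ones(len(experts)) / len(experts)

    def predict(self, x):
        # Weighted combination of all expert predictions.
        preds = np.array([e(x) for e in self.experts])
        return float(self.weights @ preds)

    def update(self, x, y):
        # Hedge-style multiplicative update on each expert's squared loss.
        preds = np.array([e(x) for e in self.experts])
        losses = (preds - y) ** 2
        self.weights *= np.exp(-self.eta * losses)
        self.weights /= self.weights.sum()

# Toy stream: two static "offline experts" and one adaptive "online expert".
offline_experts = [lambda x: 0.0, lambda x: 1.0]
state = {"mean": 0.5}
def online_expert(x):
    return state["mean"]

meta = MetaExpert(offline_experts + [online_expert])
for t in range(50):
    x, y = t, 1.0                                  # true value is 1.0
    meta.update(x, y)
    state["mean"] = 0.9 * state["mean"] + 0.1 * y  # online expert adapts
```

After the stream, the meta weight on the mismatched offline expert (predicting 0.0) collapses, while the matching offline expert and the adapted online expert dominate, so the combined prediction approaches 1.0.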