Paper Title

Expressivity of Hidden Markov Chains vs. Recurrent Neural Networks from a system theoretic viewpoint

Authors

François Desbouvries, Yohan Petetin, Achille Salaün

Abstract

Hidden Markov Chains (HMC) and Recurrent Neural Networks (RNN) are two well-known tools for predicting time series. Even though these solutions were developed independently in distinct communities, they share some similarities when considered as probabilistic structures. So in this paper we first consider HMC and RNN as generative models, and we embed both structures in a common generative unified model (GUM). We next address a comparative study of the expressivity of these models. To that end, we further assume that the models are linear and Gaussian. The probability distributions produced by these models are characterized by structured covariance series; as a consequence, expressivity reduces to comparing sets of structured covariance series, which enables us to appeal to stochastic realization theory (SRT). We finally provide conditions under which a given covariance series can be realized by a GUM, an HMC or an RNN.
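As a concrete illustration of the linear-Gaussian setting mentioned in the abstract (this sketch is not taken from the paper; the matrices F, H, Q, R are illustrative choices, not the authors' notation), the following Python snippet simulates a small linear-Gaussian hidden Markov chain, i.e. a linear state-space model, and compares its empirical output covariances with the structured covariance sequence Gamma_0 = H P H^T + R and Gamma_k = H F^k P H^T for k > 0, where P solves the stationary Lyapunov equation P = F P F^T + Q.

```python
# Minimal sketch (illustrative only): a linear-Gaussian state-space model,
# i.e. a linear-Gaussian HMC, and the structured covariance series it induces.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

rng = np.random.default_rng(0)

# Hidden state:  x_{t+1} = F x_t + w_t,  w_t ~ N(0, Q)
# Observation:   y_t     = H x_t + v_t,  v_t ~ N(0, R)
# (illustrative values, not taken from the paper)
F = np.array([[0.9, 0.1],
              [0.0, 0.7]])
H = np.array([[1.0, 0.5]])
Q = 0.1 * np.eye(2)
R = np.array([[0.05]])

# Stationary state covariance: P = F P F^T + Q (discrete Lyapunov equation).
P = solve_discrete_lyapunov(F, Q)

def structured_cov(k):
    """Covariance Gamma_k = E[y_{t+k} y_t^T] implied by the model structure."""
    if k == 0:
        return H @ P @ H.T + R
    return H @ np.linalg.matrix_power(F, k) @ P @ H.T

# Simulate a stationary trajectory and estimate the same covariances empirically.
T = 200_000
x = rng.multivariate_normal(np.zeros(2), P)   # start at stationarity
y = np.empty(T)
for t in range(T):
    y[t] = (H @ x)[0] + rng.normal(0.0, np.sqrt(R[0, 0]))
    x = F @ x + rng.multivariate_normal(np.zeros(2), Q)

for k in range(4):
    empirical = np.mean(y[k:] * y[:T - k])
    print(f"Gamma_{k}: structured={structured_cov(k)[0, 0]:.4f}  empirical={empirical:.4f}")
```

Stochastic realization theory addresses the converse question: given a covariance series of this structured form, which models realize it; the paper poses this realizability question for the GUM, HMC and RNN structures.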
