Paper Title
Disentangled Sticky Hierarchical Dirichlet Process Hidden Markov Model
Paper Authors
Paper Abstract
The Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) has been used widely as a natural Bayesian nonparametric extension of the classical Hidden Markov Model for learning from sequential and time-series data. A sticky extension of the HDP-HMM has been proposed to strengthen the self-persistence probability in the HDP-HMM. However, the sticky HDP-HMM entangles the strength of the self-persistence prior and transition prior together, limiting its expressiveness. Here, we propose a more general model: the disentangled sticky HDP-HMM (DS-HDP-HMM). We develop novel Gibbs sampling algorithms for efficient inference in this model. We show that the disentangled sticky HDP-HMM outperforms the sticky HDP-HMM and HDP-HMM on both synthetic and real data, and apply the new approach to analyze neural data and segment behavioral video data.
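The entanglement the abstract refers to can be made concrete with a small sketch. The snippet below samples transition rows under a weak-limit (finite-state) approximation of the two priors. The sticky HDP-HMM form follows Fox et al. (2008), where a single parameter kappa both boosts the self-transition mass and inflates the total concentration alpha + kappa; the "disentangled" branch is only an illustrative reading of the DS-HDP-HMM construction (a per-state Beta self-persistence weight mixed with an ordinary HDP transition row), and the parameter names L, gamma, alpha, kappa, rho1, rho2 and their values are assumptions chosen for the demo, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weak-limit (finite-L) approximation: a Dirichlet with L components
# stands in for the Dirichlet process.
L, gamma, alpha = 10, 1.0, 5.0
beta = rng.dirichlet(np.full(L, gamma / L))  # shared base transition measure

# --- Sticky HDP-HMM prior (Fox et al. 2008, weak-limit form) ---
# A single extra mass kappa on the self-transition entry. The same kappa
# both raises expected self-persistence AND raises the total concentration
# alpha + kappa, so stickiness and prior variability are entangled.
kappa = 20.0
pi_sticky = np.array([
    rng.dirichlet(alpha * beta + kappa * np.eye(L)[j]) for j in range(L)
])

# --- Disentangled sticky prior (illustrative reading of DS-HDP-HMM) ---
# A separate Beta-distributed self-persistence weight kappa_j is mixed with
# an ordinary HDP transition row, so self-persistence strength and
# transition concentration are controlled by independent hyperparameters.
rho1, rho2 = 8.0, 2.0                       # assumed Beta hyperparameters
kappa_j = rng.beta(rho1, rho2, size=L)      # per-state self-persistence
pi_bar = np.array([rng.dirichlet(alpha * beta) for _ in range(L)])
pi_dis = kappa_j[:, None] * np.eye(L) + (1 - kappa_j)[:, None] * pi_bar

print("mean self-transition, sticky:      ", pi_sticky.diagonal().mean())
print("mean self-transition, disentangled:", pi_dis.diagonal().mean())
```

In the sticky prior, pushing the diagonal higher necessarily tightens every row around its mean; in the disentangled sketch, rho1/rho2 set self-persistence while alpha independently sets how variable the off-diagonal transition pattern is, which is the extra expressiveness the abstract claims.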