Title

NEDMP: Neural Enhanced Dynamic Message Passing

Authors

Fei Gao, Yan Zhang, Jiang Zhang

Abstract

Predicting stochastic spreading processes on complex networks is critical for epidemic control, opinion propagation, and viral marketing. We focus on inferring the time-dependent marginal state probabilities of each node, which collectively quantify the spreading outcome. Dynamic Message Passing (DMP) has been developed as an efficient inference algorithm for several spreading models, and it is asymptotically exact on locally tree-like networks. However, DMP can struggle on diffusion networks with many local loops. We address this limitation by using Graph Neural Networks (GNNs) to learn the dependencies among messages implicitly. Specifically, we propose a hybrid model in which a GNN module runs jointly with the DMP equations. The GNN module refines the aggregated messages in DMP iterations by learning from simulation data. We demonstrate numerically that, after training, our model's inference accuracy substantially outperforms DMP across various network structures and dynamics parameters. Moreover, compared to purely data-driven models, the proposed hybrid model generalizes better to out-of-training cases, benefiting from the dynamics priors explicitly built into the hybrid model. A PyTorch implementation of our model is available at https://github.com/FeiGSSS/NEDMP.
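To make the setup concrete, here is a minimal sketch of the discrete-time DMP equations for the SI model that the abstract builds on, in plain Python. This is not the paper's code: the function name `dmp_si` and the comment marking where NEDMP's GNN module would intervene are illustrative assumptions; the message updates follow the standard DMP formulation for SI spreading (messages `theta` and `phi` on directed edges, cavity probabilities over neighbors).

```python
# Sketch of discrete-time Dynamic Message Passing (DMP) for the SI model.
# theta[(i, j)]: prob. that i has NOT yet transmitted infection to j.
# phi[(i, j)]:   prob. that i is infected and has not transmitted to j.
# Names and structure are illustrative, not the NEDMP repository's API.

def dmp_si(edges, n, seeds, lam, steps):
    """Return the marginal probability P_S^i(t) that each node is still
    susceptible after `steps` DMP iterations.

    edges: undirected (i, j) pairs; n: number of nodes;
    seeds: initially infected nodes; lam: per-step transmission probability.
    """
    neighbors = [[] for _ in range(n)]
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)

    ps0 = [0.0 if i in seeds else 1.0 for i in range(n)]
    theta = {(i, j): 1.0 for i in range(n) for j in neighbors[i]}
    phi = {(i, j): 1.0 - ps0[i] for i in range(n) for j in neighbors[i]}

    def ps_cavity(i, j, th):
        # Cavity probability that i is susceptible, excluding neighbor j.
        p = ps0[i]
        for k in neighbors[i]:
            if k != j:
                p *= th[(k, i)]
        return p

    for _ in range(steps):
        new_theta = {e: theta[e] - lam * phi[e] for e in theta}
        # NEDMP's GNN module would refine this message-aggregation step,
        # learning corrections from simulation data to handle local loops.
        new_phi = {
            (i, j): (1.0 - lam) * phi[(i, j)]
            + ps_cavity(i, j, theta) - ps_cavity(i, j, new_theta)
            for (i, j) in phi
        }
        theta, phi = new_theta, new_phi

    # Marginals: P_S^i(t) = P_S^i(0) * prod over neighbors k of theta[(k, i)]
    marg = []
    for i in range(n):
        p = ps0[i]
        for k in neighbors[i]:
            p *= theta[(k, i)]
        marg.append(p)
    return marg
```

On a tree (e.g., a path graph) these equations are exact, matching the abstract's claim that DMP is asymptotically exact on locally tree-like networks; loops are precisely where the learned GNN correction is meant to help.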
