Paper Title
STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer

Paper Authors

Trung Le, Eli Shlizerman

Paper Abstract
Modeling the neural population dynamics underlying noisy single-trial spiking activity is essential for relating neural observations to behavior. A recent non-recurrent method, the Neural Data Transformer (NDT), has shown great success in capturing neural dynamics with low inference latency and without an explicit dynamical model. However, NDT focuses on modeling the temporal evolution of population activity while neglecting the rich covariation between individual neurons. In this paper we introduce the SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models the responses of individual neurons in the population across time and space to uncover their underlying firing rates. In addition, we propose a contrastive learning loss that works in conjunction with the masked modeling objective to further improve predictive performance. We show that our model achieves state-of-the-art performance at the ensemble level in estimating neural activity across four neural datasets, demonstrating its capability to capture autonomous and non-autonomous dynamics spanning different cortical regions while being completely agnostic to the specific behaviors at hand. Furthermore, STNDT's spatial attention mechanism reveals consistently important subsets of neurons that play a vital role in driving the response of the entire population, providing interpretability and key insights into how the population of neurons performs computation.
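The core idea of the abstract, attending over both the temporal axis (time bins) and the spatial axis (individual neurons) of a trial, can be sketched roughly as follows. This is a minimal single-head, projection-free illustration with made-up shapes, not the authors' implementation; the way the two attention outputs are combined here (a plain sum) is a stand-in assumption.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Plain single-head self-attention over the rows of X; queries,
    # keys, and values are X itself (identity projections), purely
    # for illustration.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # (rows, rows) similarity
    return softmax(scores, axis=-1) @ X  # weighted sum of rows

# One trial of binned spike counts: T time bins x N neurons.
T, N = 8, 5
rng = np.random.default_rng(0)
trial = rng.poisson(2.0, size=(T, N)).astype(float)

# Temporal attention: each time bin attends to all time bins
# (the NDT-style view of population activity).
temporal_out = self_attention(trial)      # (T, N)

# Spatial attention: transpose so each *neuron* attends to all
# neurons, capturing covariation across the population.
spatial_out = self_attention(trial.T).T   # (T, N)

# STNDT combines both views; a crude stand-in here is a sum.
combined = temporal_out + spatial_out
print(combined.shape)
```

Running the sketch prints `(8, 5)`: the combined representation keeps the trial's time-by-neuron shape, so a downstream head can predict per-bin, per-neuron firing rates from it.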