Paper Title

Theory of gating in recurrent neural networks

Paper Authors

Krishnamurthy, Kamesh; Can, Tankut; Schwab, David J.

Paper Abstract

Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating - i.e. multiplicative - interactions are ubiquitous in real neurons and also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: i) timescales and ii) dimensionality. The gate controlling timescales leads to a novel, marginally stable state, where the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing a map for principled parameter initialization choices to ML practitioners.
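
The abstract does not spell out the model equations, but a minimal sketch may help make the two gates concrete. The code below assumes a GRU-style continuous-time gated RNN of the general form studied in this line of work: an update gate multiplying the full right-hand side (controlling timescales) and an output gate multiplying the recurrent input (controlling dimensionality). It integrates two nearby trajectories and estimates the largest Lyapunov exponent with the standard Benettin renormalization procedure; a positive exponent signals chaotic dynamics. The specific equations and all parameter values (N, g, alpha_z, alpha_r, dt, T) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Minimal sketch of a GRU-style continuous-time gated RNN (an assumed form):
#   dh/dt = sigma(Wz h) * ( -h + J (sigma(Wr h) * tanh(h)) )
# The update gate sets the local timescale; the output gate modulates the
# recurrent input. All parameter values here are illustrative choices.
N = 200           # number of units (illustrative)
g = 3.0           # coupling gain; large g favors chaotic activity
alpha_z = 1.0     # update-gate weight scale (illustrative)
alpha_r = 1.0     # output-gate weight scale (illustrative)
dt, T = 0.05, 4000

rng = np.random.default_rng(0)
J  = rng.normal(0.0, g / np.sqrt(N), (N, N))        # random recurrent couplings
Wz = rng.normal(0.0, alpha_z / np.sqrt(N), (N, N))  # update-gate weights
Wr = rng.normal(0.0, alpha_r / np.sqrt(N), (N, N))  # output-gate weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def velocity(h):
    z = sigmoid(Wz @ h)   # update gate in (0, 1): controls timescales
    r = sigmoid(Wr @ h)   # output gate in (0, 1): controls dimensionality
    return z * (-h + J @ (r * np.tanh(h)))

# Benettin method: track a tiny perturbation, rescaling it each step;
# the average log stretch rate estimates the largest Lyapunov exponent.
eps = 1e-8
h = rng.normal(0.0, 1.0, N)
u = rng.normal(0.0, 1.0, N)
u /= np.linalg.norm(u)
hp = h + eps * u
lyap = 0.0
for _ in range(T):
    h  = h  + dt * velocity(h)
    hp = hp + dt * velocity(hp)
    d = np.linalg.norm(hp - h)
    lyap += np.log(d / eps)
    hp = h + (eps / d) * (hp - h)   # rescale perturbation back to eps

print("largest Lyapunov exponent ~", lyap / (T * dt))
```

Sweeping the coupling gain and the two gate gains while recording the sign of the estimated exponent gives, in spirit, the kind of phase diagram the abstract describes as a map for principled parameter initialization.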
