Paper Title
Neuromorphic Overparameterisation and Few-Shot Learning in Multilayer Physical Neural Networks
Paper Authors
Paper Abstract
Physical neuromorphic computing, exploiting the complex dynamics of physical systems, has seen rapid advancements in sophistication and performance. Physical reservoir computing, a subset of neuromorphic computing, faces limitations due to its reliance on single systems. This constrains output dimensionality and dynamic range, limiting performance to a narrow range of tasks. Here, we engineer a suite of nanomagnetic array physical reservoirs and interconnect them in parallel and series to create a multilayer neural network architecture. The output of one reservoir is recorded, scaled and virtually fed as input to the next reservoir. This networked approach increases output dimensionality, internal dynamics and computational performance. We demonstrate that a physical neuromorphic system can achieve an overparameterised state, facilitating meta-learning on small training sets and yielding strong performance across a wide range of tasks. Our approach's efficacy is further demonstrated through few-shot learning, where the system rapidly adapts to new tasks.
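The abstract's core mechanism is that one reservoir's output is recorded, rescaled, and "virtually" fed forward as the next reservoir's input. As a minimal illustrative sketch only (not the authors' hardware or code), this chaining can be mimicked in software with generic leaky echo-state reservoirs standing in for the nanomagnetic arrays; every name and parameter below (Reservoir, chain, spectral_radius, leak, scale) is a hypothetical stand-in.

import numpy as np

rng = np.random.default_rng(0)

class Reservoir:
    """Leaky echo-state reservoir: a hypothetical software analogue of one
    physical nanomagnetic array, not the paper's actual device."""
    def __init__(self, n_in, n_nodes, spectral_radius=0.9, leak=0.5):
        self.W_in = rng.uniform(-1.0, 1.0, (n_nodes, n_in))
        W = rng.normal(0.0, 1.0, (n_nodes, n_nodes))
        # Rescale recurrent weights so internal dynamics neither die nor blow up.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W, self.leak = W, leak
        self.state = np.zeros(n_nodes)

    def run(self, inputs):
        """Drive the reservoir with a (T, n_in) series; return (T, n_nodes) states."""
        states = []
        for u in inputs:
            pre = self.W_in @ u + self.W @ self.state
            self.state = (1 - self.leak) * self.state + self.leak * np.tanh(pre)
            states.append(self.state.copy())
        return np.array(states)

def chain(reservoirs, inputs, scale=1.0):
    """Series interconnection: record each layer's states, rescale them to the
    next layer's input range, and feed them forward 'virtually'."""
    x, all_states = inputs, []
    for res in reservoirs:
        states = res.run(x)
        all_states.append(states)
        # Normalise to roughly [-1, 1] before use as the next layer's input.
        x = scale * states / (np.abs(states).max() + 1e-12)
    # Concatenating every layer's states is what grows output dimensionality.
    return np.hstack(all_states)

# Toy usage: two chained reservoirs driven by a scalar series, linear readout.
T = 200
u = np.sin(np.linspace(0, 8 * np.pi, T))[:, None]
layers = [Reservoir(1, 50), Reservoir(50, 50)]
features = chain(layers, u)                    # (T, 100) readout features
target = np.roll(u[:, 0], -1)                  # predict the next input value
w = np.linalg.lstsq(features, target, rcond=None)[0]
print("train MSE:", np.mean((features @ w - target) ** 2))

In this toy setting, concatenating the states of all layers into one readout vector is what pushes the feature count past the training-set size, i.e. toward the overparameterised regime the abstract describes.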