Paper Title
New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation
Paper Authors
Paper Abstract
Hopfield neural networks are a possible basis for modelling associative memory in living organisms. After summarising previous studies in the field, we take a new look at learning rules, exhibiting them as descent-type algorithms for various cost functions. We also propose several new cost functions suitable for learning. We discuss the role of biases (the external inputs) in the learning process in Hopfield networks. Furthermore, we apply Newton's method to learning memories, and experimentally compare the performance of various learning rules. Finally, to add to the debate on whether allowing connections of a neuron to itself enhances memory capacity, we numerically investigate the effects of self-coupling.

Keywords: Hopfield networks, associative memory, content-addressable memory, learning rules, gradient descent, attractor networks
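To make the setting concrete, the following is a minimal sketch (in Python/NumPy, with illustrative names) of a Hopfield network trained with the classical Hebbian outer-product rule. This is one standard baseline among the learning rules papers in this area typically compare, not necessarily the rules or implementation studied here; the zeroed diagonal reflects the usual convention of disallowing self-coupling, which the abstract notes is itself under debate.

```python
import numpy as np

def train_hebbian(patterns):
    """Store binary (+1/-1) patterns via the classical Hebbian outer-product rule."""
    n_patterns, n_units = patterns.shape
    W = patterns.T @ patterns / n_units   # sum of outer products, scaled by network size
    np.fill_diagonal(W, 0.0)              # conventional choice: no self-coupling
    return W

def recall(W, state, n_steps=2000, rng=None):
    """Asynchronous recall: update one randomly chosen unit at a time."""
    rng = np.random.default_rng() if rng is None else rng
    state = state.copy()
    for _ in range(n_steps):
        i = rng.integers(len(state))
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Usage: store two random patterns, then recall one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 100))
W = train_hebbian(patterns)
cue = patterns[0].copy()
cue[:10] *= -1                            # flip 10% of the bits
print(np.array_equal(recall(W, cue, rng=rng), patterns[0]))
```

The descent view mentioned in the abstract would replace `train_hebbian` with iterative updates that minimise a chosen cost function over the weights; the recall dynamics stay the same.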