Paper Title

Hypercomplex-Valued Recurrent Correlation Neural Networks

Paper Authors

Marcos Eduardo Valle and Rodolfo Anibal Lobo

Paper Abstract

Recurrent correlation neural networks (RCNNs), introduced by Chiueh and Goodman as an improved version of the bipolar correlation-based Hopfield neural network, can be used to implement high-capacity associative memories. In this paper, we extend the bipolar RCNNs for processing hypercomplex-valued data. Precisely, we present the mathematical background for a broad class of hypercomplex-valued RCNNs. Then, we provide the necessary conditions which ensure that a hypercomplex-valued RCNN always settles at an equilibrium using either synchronous or asynchronous update modes. Examples with bipolar, complex, hyperbolic, quaternion, and octonion-valued RCNNs are given to illustrate the theoretical results. Finally, computational experiments confirm the potential application of hypercomplex-valued RCNNs as associative memories designed for the storage and recall of gray-scale images.
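
To make the recall dynamics described in the abstract concrete, the sketch below implements the bipolar special case, i.e. the exponential recurrent correlation associative memory of Chiueh and Goodman with a synchronous update: the state is correlated with each stored pattern, the correlations are passed through an exponential excitation function, and the sign of the weighted superposition of the memories gives the next state. This is a minimal NumPy illustration, not the authors' implementation; the function name `recall`, the gain `alpha`, and the stopping rule are illustrative assumptions.

```python
# Minimal sketch of a bipolar recurrent correlation neural network
# (the exponential variant, often called ECAM). Illustrative only.
import numpy as np

def recall(patterns, probe, alpha=1.0, max_iter=100):
    """Iterate the bipolar RCNN dynamics until the state settles.

    patterns : (p, n) array with entries in {-1, +1}, the stored memories.
    probe    : (n,) bipolar input to be recalled.
    alpha    : gain of the exponential excitation function f(w) = exp(alpha * w).
    """
    U = np.asarray(patterns, dtype=float)
    x = np.asarray(probe, dtype=float)
    n = U.shape[1]
    for _ in range(max_iter):
        # Correlation of the current state with each stored pattern, scaled to [-1, 1].
        w = U @ x / n
        # Weighted superposition of the memories followed by the sign (bipolar) activation.
        x_new = np.sign(np.exp(alpha * w) @ U)
        x_new[x_new == 0] = 1.0          # resolve ties to keep the state bipolar
        if np.array_equal(x_new, x):     # settled at an equilibrium
            break
        x = x_new
    return x

# Usage: store two random bipolar patterns and recall the first from a noisy probe.
rng = np.random.default_rng(0)
U = np.where(rng.random((2, 64)) < 0.5, -1.0, 1.0)
noisy = U[0] * np.where(rng.random(64) < 0.1, -1.0, 1.0)  # flip ~10% of the entries
print(np.array_equal(recall(U, noisy, alpha=5.0), U[0]))
```

The hypercomplex-valued RCNNs studied in the paper follow the same recipe, with the inner product, the excitation function, and the sign-like activation replaced by their hypercomplex counterparts (e.g., complex, hyperbolic, quaternion, or octonion valued).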
