Paper Title
ATLAS: Universal Function Approximator for Memory Retention
Paper Authors
Paper Abstract
Artificial neural networks (ANNs), despite their universal function approximation capability and practical success, are subject to catastrophic forgetting. Catastrophic forgetting refers to the abrupt unlearning of a previous task when a new task is learned. It is an emergent phenomenon that hinders continual learning. Existing universal function approximation theorems for ANNs guarantee function approximation ability, but do not predict catastrophic forgetting. This paper presents a novel universal approximation theorem for multi-variable functions using only single-variable functions and exponential functions. Furthermore, we present ATLAS: a novel ANN architecture based on the new theorem. It is shown that ATLAS is a universal function approximator capable of some memory retention and continual learning. The memory of ATLAS is imperfect, with some off-target effects during continual learning, but it is well-behaved and predictable. An efficient implementation of ATLAS is provided. Experiments are conducted to evaluate both the function approximation and memory retention capabilities of ATLAS.
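As an illustrative aside (this is not the ATLAS construction itself), the general flavour of representing a multi-variable function through single-variable functions and exponentials can be seen in the elementary identity x·y = exp(ln x + ln y) for positive inputs. The sketch below is only a hypothetical numerical check of that identity; the function name and test setup are assumptions for illustration, not part of the paper.

```python
import numpy as np

# Toy illustration only: a two-variable product recovered from
# single-variable functions (logarithms) combined by an exponential,
# i.e. x * y = exp(ln(x) + ln(y)) for positive x and y.
# This is NOT the ATLAS architecture or its approximation theorem.

def product_via_exp(x, y):
    """Compute x*y using only single-variable functions and exp."""
    return np.exp(np.log(x) + np.log(y))

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 10.0, size=1000)
y = rng.uniform(0.1, 10.0, size=1000)

max_err = np.max(np.abs(product_via_exp(x, y) - x * y))
print(f"max absolute error: {max_err:.2e}")  # close to machine precision
```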