Paper Title
Locally Supervised Learning with Periodic Global Guidance
Paper Authors
Paper Abstract
Locally supervised learning aims to train a neural network based on local estimates of the global loss function at each decoupled module of the network. Auxiliary networks are typically appended to the modules to approximate gradient updates based on greedy local losses. Despite its advantages in parallelism and reduced memory consumption, this training paradigm severely degrades the generalization performance of neural networks. In this paper, we propose Periodically Guided local Learning (PGL), which periodically reinstates the global objective into the local-loss-based training of neural networks, primarily to enhance the model's generalization capability. We show that a simple periodic guidance scheme yields significant performance gains while maintaining a low memory footprint. We conduct extensive experiments on various datasets and networks to demonstrate the effectiveness of PGL, especially in configurations with many decoupled modules.
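
To make the described scheme concrete, below is a minimal PyTorch-style sketch of local-loss training with periodic global guidance. The module split, the auxiliary linear heads, and the guidance_period value are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Minimal sketch: locally supervised training with periodic global guidance.
# The architecture, auxiliary heads, and guidance_period are assumptions made
# for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalModule(nn.Module):
    """A decoupled module paired with an auxiliary classifier head."""
    def __init__(self, in_dim, out_dim, num_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.aux_head = nn.Linear(out_dim, num_classes)  # head for the greedy local loss

    def forward(self, x):
        h = self.body(x)
        return h, self.aux_head(h)

modules = nn.ModuleList([
    LocalModule(784, 256, 10),
    LocalModule(256, 128, 10),
    LocalModule(128, 10, 10),
])
optimizer = torch.optim.SGD(modules.parameters(), lr=0.1)
guidance_period = 10  # assumed: every K-th step uses the global objective

def train_step(x, y, step):
    optimizer.zero_grad()
    if step % guidance_period == 0:
        # Periodic global guidance: ordinary end-to-end backprop through all modules.
        h = x
        for m in modules:
            h, logits = m(h)
        F.cross_entropy(logits, y).backward()
    else:
        # Locally supervised step: each module trains only on its own auxiliary
        # loss; detaching the activation blocks gradient flow across module
        # boundaries, so each backward pass stays confined to one module.
        h = x
        for m in modules:
            h, logits = m(h)
            F.cross_entropy(logits, y).backward()
            h = h.detach()  # the next module receives a graph-free input
    optimizer.step()

# Toy usage with random data standing in for a real dataset.
for step in range(100):
    x = torch.randn(32, 784)
    y = torch.randint(0, 10, (32,))
    train_step(x, y, step)
```

In the local branch, the computation graph never spans more than one module, which is where the reduced memory footprint comes from; the periodic global step temporarily restores full end-to-end gradient flow to supply the global guidance the abstract describes.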