Paper Title

Meta-Baseline: Exploring Simple Meta-Learning for Few-Shot Learning

Paper Authors

Yinbo Chen, Zhuang Liu, Huijuan Xu, Trevor Darrell, Xiaolong Wang

Abstract

Meta-learning has been the most common framework for few-shot learning in recent years. It learns the model from collections of few-shot classification tasks, which is believed to have a key advantage of making the training objective consistent with the testing objective. However, some recent works report that training for whole-classification, i.e. classification on the whole label set, can yield comparable or even better embeddings than many meta-learning algorithms. The gap between these two lines of work has been underexplored, and the effectiveness of meta-learning in few-shot learning remains unclear. In this paper, we explore a simple process: meta-learning over a whole-classification pre-trained model on its evaluation metric. We observe that this simple method achieves performance competitive with state-of-the-art methods on standard benchmarks. Our further analysis sheds some light on the trade-offs between the meta-learning objective and the whole-classification objective in few-shot learning.
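The "evaluation metric" the abstract meta-learns over is nearest-centroid classification with cosine similarity: support embeddings from the pre-trained encoder are averaged into per-class centroids, and a query is assigned to the centroid with the highest scaled cosine similarity. Below is a minimal NumPy sketch of one such few-shot episode; the function names and the fixed temperature `tau` are illustrative assumptions (in the paper the temperature is a learnable scalar and the encoder is a trained network), not the authors' code.

```python
import numpy as np

def cosine_logits(query, centroids, tau=10.0):
    """Scaled cosine similarity between query embeddings and class centroids.

    tau plays the role of the temperature; fixed here for illustration,
    learnable in the actual method.
    """
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return tau * (q @ c.T)

def episode_predict(support, support_labels, query, n_way, tau=10.0):
    """One few-shot episode: nearest-centroid classification in embedding space.

    support: (n_way * k_shot, d) embeddings from the pre-trained encoder
    support_labels: (n_way * k_shot,) integer class labels in [0, n_way)
    query: (n_query, d) query embeddings
    """
    # Average each class's support embeddings into a centroid (prototype).
    centroids = np.stack([support[support_labels == c].mean(axis=0)
                          for c in range(n_way)])
    logits = cosine_logits(query, centroids, tau)
    return logits.argmax(axis=1)

# Toy 2-way, 5-shot episode: two clusters pointing in different directions
# (cosine similarity is direction-based, so clusters must differ in angle).
rng = np.random.default_rng(0)
d = 8
base = np.zeros((2, d))
base[0, :4] = 1.0   # class 0 direction
base[1, 4:] = 1.0   # class 1 direction
support = np.concatenate([base[0] + rng.normal(0, 0.05, (5, d)),
                          base[1] + rng.normal(0, 0.05, (5, d))])
labels = np.repeat([0, 1], 5)
query = np.stack([base[0] + 0.01, base[1] - 0.01])
pred = episode_predict(support, labels, query, n_way=2)  # → [0, 1]
```

During the meta-learning stage, these logits would be fed to a cross-entropy loss over episodes to fine-tune the encoder, which is what aligns the training objective with the few-shot evaluation.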
