Paper Title
Training few-shot classification via the perspective of minibatch and pretraining
Paper Authors
Paper Abstract
Few-shot classification is a challenging task that aims to formulate the human ability to learn concepts from limited prior data, and it has drawn considerable attention in machine learning. Recent progress in few-shot classification has featured meta-learning, in which a parameterized model of a learning algorithm is defined and trained on an extremely large or infinite number of episodes, each representing a different classification task with a small labeled support set and a corresponding query set. In this work, we advance this few-shot classification paradigm by formulating it as a supervised classification learning problem. We further propose multi-episode and cross-way training techniques, which correspond respectively to minibatching and pretraining in ordinary classification. Experimental results with a state-of-the-art few-shot classification method (prototypical networks) demonstrate that both proposed training strategies substantially accelerate training without loss of accuracy on varying few-shot classification problems on Omniglot and miniImageNet.
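The episodic setup described above can be sketched in a few lines: each episode contributes a prototypical-network loss (queries classified by distance to per-class support-set means), and multi-episode training averages the loss over a "minibatch" of episodes, mirroring minibatch SGD in ordinary supervised classification. This is a minimal illustrative sketch, not the paper's implementation; the embeddings are taken as given numpy arrays and the function names are hypothetical.

```python
import numpy as np

def prototype_loss(support, support_labels, query, query_labels, n_way):
    """One few-shot episode: build class prototypes from the labeled support
    set, then score queries by distance to each prototype."""
    # Prototype of class c = mean embedding of its support examples.
    protos = np.stack([support[support_labels == c].mean(axis=0)
                       for c in range(n_way)])                  # (n_way, dim)
    # Squared Euclidean distance from every query to every prototype.
    d2 = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(-1)  # (n_q, n_way)
    # Log-softmax over negative distances (numerically stabilized).
    logits = -d2
    logits = logits - logits.max(axis=1, keepdims=True)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy of queries against their true classes.
    return -logp[np.arange(len(query_labels)), query_labels].mean()

def multi_episode_loss(episodes, n_way):
    """Multi-episode training step: average the episodic loss over a
    'minibatch' of episodes before one gradient update."""
    return float(np.mean([prototype_loss(*ep, n_way) for ep in episodes]))
```

In a real training loop the support/query embeddings would come from a shared encoder network, and the averaged loss would be backpropagated once per multi-episode batch rather than once per episode.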