Paper Title

A Unified Framework with Meta-dropout for Few-shot Learning

Paper Authors

Shaobo Lin, Xingyu Zeng, Rui Zhao

Paper Abstract

Conventional training of deep neural networks usually requires a substantial amount of data with expensive human annotations. In this paper, we utilize the idea of meta-learning to explain two very different streams of few-shot learning, i.e., the episodic meta-learning-based and pre-train finetune-based few-shot learning, and form a unified meta-learning framework. In order to improve the generalization power of our framework, we propose a simple yet effective strategy named meta-dropout, which is applied to the transferable knowledge generalized from base categories to novel categories. The proposed strategy can effectively prevent neural units from co-adapting excessively in the meta-training stage. Extensive experiments on the few-shot object detection and few-shot image classification datasets, i.e., Pascal VOC, MS COCO, CUB, and mini-ImageNet, validate the effectiveness of our method.
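As a rough illustration of the idea described in the abstract, the sketch below applies dropout to a shared feature embedding standing in for the "transferable knowledge", inside a prototypical-network-style meta-training episode. The backbone, layer sizes, episode construction, and module names are assumptions made for illustration only, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the authors' code): dropout is applied to the
# shared embedding used across base-class episodes, so that units of the
# transferable representation do not co-adapt excessively during meta-training.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaDropoutEmbedding(nn.Module):
    """Feature extractor whose output (the assumed 'transferable knowledge')
    is regularized with dropout during the meta-training stage only."""
    def __init__(self, in_dim=512, feat_dim=128, p=0.5):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim),
        )
        self.dropout = nn.Dropout(p)  # inactive in eval() mode, i.e. at meta-test time

    def forward(self, x):
        z = self.encoder(x)
        return self.dropout(z)  # dropout on the shared embedding

def episode_loss(model, support_x, support_y, query_x, query_y, n_way):
    """One meta-training episode: build class prototypes from the support set,
    classify the query set by nearest prototype."""
    z_support = model(support_x)                       # [n_way * k_shot, d]
    z_query = model(query_x)                           # [n_query, d]
    prototypes = torch.stack(
        [z_support[support_y == c].mean(0) for c in range(n_way)]
    )                                                  # [n_way, d]
    logits = -torch.cdist(z_query, prototypes)         # negative distances as logits
    return F.cross_entropy(logits, query_y)

if __name__ == "__main__":
    # Illustrative 5-way 1-shot episode on random features.
    n_way, k_shot, n_query, in_dim = 5, 1, 15, 512
    model = MetaDropoutEmbedding(in_dim=in_dim)
    optim = torch.optim.SGD(model.parameters(), lr=1e-2)

    support_x = torch.randn(n_way * k_shot, in_dim)
    support_y = torch.arange(n_way).repeat_interleave(k_shot)
    query_x = torch.randn(n_query, in_dim)
    query_y = torch.randint(0, n_way, (n_query,))

    model.train()                                      # meta-training: dropout active
    loss = episode_loss(model, support_x, support_y, query_x, query_y, n_way)
    optim.zero_grad()
    loss.backward()
    optim.step()
    print(f"episode loss: {loss.item():.4f}")
```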
