Paper title
Meta-free few-shot learning via representation learning with weight averaging
Paper authors
Paper abstract
Recent studies on few-shot classification using transfer learning pose challenges to the effectiveness and efficiency of episodic meta-learning algorithms. Transfer learning approaches are a natural alternative, but they are restricted to few-shot classification. Moreover, little attention has been paid to the development of probabilistic models with well-calibrated uncertainty from few-shot samples, except for some Bayesian episodic learning algorithms. To tackle the aforementioned issues, we propose a new transfer learning method to obtain accurate and reliable models for few-shot regression and classification. The resulting method does not require episodic meta-learning and is called meta-free representation learning (MFRL). MFRL first learns a low-rank representation that generalizes well to meta-test tasks. Given the learned representation, probabilistic linear models are fine-tuned with few-shot samples to obtain models with well-calibrated uncertainty. The proposed method not only achieves the highest accuracy on a wide range of few-shot learning benchmark datasets but also correctly quantifies the prediction uncertainty. In addition, weight averaging and temperature scaling are effective in improving the accuracy and reliability of few-shot learning in existing meta-learning algorithms with a wide range of learning paradigms and model architectures.
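The abstract highlights two generic techniques, weight averaging and temperature scaling, as drivers of accuracy and calibration. A minimal NumPy sketch of both, in their standard forms (SWA-style checkpoint averaging and post-hoc softmax temperature scaling); this is an illustration of the generic techniques, not the paper's exact implementation, and all function names are hypothetical:

```python
import numpy as np

def average_weights(checkpoints):
    """SWA-style averaging: mean of corresponding parameters across checkpoints."""
    keys = checkpoints[0].keys()
    return {k: np.mean([c[k] for c in checkpoints], axis=0) for k in keys}

def temperature_scale(logits, T):
    """Soften logits by temperature T > 1 before softmax to reduce overconfidence."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy example: average two checkpoints of a single weight vector,
# then calibrate a 2-class logit vector with temperature T = 2.
ckpts = [{"w": np.array([1.0, 3.0])}, {"w": np.array([3.0, 5.0])}]
avg = average_weights(ckpts)                          # {"w": [2.0, 4.0]}
probs = temperature_scale(np.array([2.0, 0.0]), T=2.0)
```

With T = 2 the winning-class probability drops relative to plain softmax (T = 1), which is how temperature scaling trades raw confidence for calibration.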