Paper Title
FROST: Faster and more Robust One-shot Semi-supervised Training

Paper Authors

Helena E. Liu, Leslie N. Smith

Paper Abstract
Recent advances in one-shot semi-supervised learning have lowered the barrier for deep learning of new applications. However, the state-of-the-art for semi-supervised learning is slow to train and the performance is sensitive to the choices of the labeled data and hyper-parameter values. In this paper, we present a one-shot semi-supervised learning method that trains up to an order of magnitude faster and is more robust than state-of-the-art methods. Specifically, we show that by combining semi-supervised learning with a one-stage, single network version of self-training, our FROST methodology trains faster and is more robust to choices for the labeled samples and changes in hyper-parameters. Our experiments demonstrate FROST's capability to perform well when the composition of the unlabeled data is unknown; that is, when the unlabeled data contain unequal numbers of each class and can contain out-of-distribution examples that don't belong to any of the training classes. High performance, speed of training, and insensitivity to hyper-parameters make FROST the most practical method for one-shot semi-supervised training. Our code is available at https://github.com/HelenaELiu/FROST.
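The core idea the abstract refers to, self-training with pseudo-labels from a single model, can be illustrated with a toy sketch. This is not the FROST algorithm itself (the paper combines self-training with semi-supervised deep training); the nearest-centroid classifier, confidence threshold, and data below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-shot setting: two Gaussian classes in 2D, lots of unlabeled
# data, and effectively one labeled example per class.
n = 200
x_unlabeled = np.concatenate([
    rng.normal(loc=-2.0, scale=1.0, size=(n // 2, 2)),
    rng.normal(loc=+2.0, scale=1.0, size=(n // 2, 2)),
])
y_true = np.array([0] * (n // 2) + [1] * (n // 2))

# "One-shot" labels: a single labeled point per class seeds the centroids.
centroids = np.array([[-2.5, 0.3], [2.4, -0.2]])

def predict(x, centroids):
    # Nearest-centroid prediction; distance serves as a confidence proxy.
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1), d.min(axis=1)

for _ in range(3):  # a few self-training rounds
    labels, dist = predict(x_unlabeled, centroids)
    confident = dist < 2.0  # keep only confident pseudo-labels (assumed threshold)
    for c in range(2):      # refit each centroid on its pseudo-labeled points
        mask = confident & (labels == c)
        if mask.any():
            centroids[c] = x_unlabeled[mask].mean(axis=0)

final_labels, _ = predict(x_unlabeled, centroids)
accuracy = (final_labels == y_true).mean()
print(f"self-training accuracy: {accuracy:.2f}")
```

The loop mirrors the self-training cycle the abstract alludes to: predict on unlabeled data, keep only confident predictions as pseudo-labels, and retrain on them, all with one model rather than a separate teacher and student.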
