Paper Title
Beyond without Forgetting: Multi-Task Learning for Classification with Disjoint Datasets
Paper Authors
Paper Abstract
Multi-task Learning (MTL) for classification with disjoint datasets aims to explore MTL when each task has only one labeled dataset. In existing methods, the unlabeled datasets are not fully exploited to facilitate each task. Inspired by semi-supervised learning, we use unlabeled datasets with pseudo labels to facilitate each task. However, there are two major issues: 1) the pseudo labels are very noisy; 2) the unlabeled datasets and the labeled dataset for each task have considerable data distribution mismatch. To address these issues, we propose our MTL with Selective Augmentation (MTL-SA) method, which selects training samples from the unlabeled datasets that have confident pseudo labels and a data distribution close to that of the labeled dataset. Then, we use the selected training samples to add information and use the remaining training samples to preserve information. Extensive experiments on face-centric and human-centric applications demonstrate the effectiveness of our MTL-SA method.
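The abstract's selection criterion (confident pseudo labels plus a data distribution close to the labeled set) can be illustrated with a minimal sketch. This is not the authors' MTL-SA implementation: the thresholds, the use of peak softmax probability as confidence, and the distance to the labeled-set feature centroid as a proxy for distribution mismatch are all illustrative assumptions.

```python
import numpy as np

def select_samples(probs, feat_unlabeled, feat_labeled_mean,
                   conf_thresh=0.9, dist_thresh=1.0):
    """Select unlabeled samples with confident pseudo labels whose
    features lie close to the labeled-set distribution.

    probs: (N, C) softmax outputs of a classifier on the unlabeled set
    feat_unlabeled: (N, D) feature vectors of the unlabeled samples
    feat_labeled_mean: (D,) mean feature vector of the labeled dataset
    Returns hard pseudo labels and a boolean selection mask.
    """
    confidence = probs.max(axis=1)        # peak softmax probability
    pseudo_labels = probs.argmax(axis=1)  # hard pseudo labels
    # Distance to the labeled-set feature centroid as a crude proxy
    # for data distribution mismatch (an assumption of this sketch).
    dist = np.linalg.norm(feat_unlabeled - feat_labeled_mean, axis=1)
    mask = (confidence >= conf_thresh) & (dist <= dist_thresh)
    return pseudo_labels, mask

# Toy usage: the first sample is confident and in-distribution,
# the second is neither, so only the first is selected.
probs = np.array([[0.95, 0.05], [0.60, 0.40]])
feats = np.array([[0.0, 0.0], [5.0, 5.0]])
labels, mask = select_samples(probs, feats, np.zeros(2))
```

Samples passing the mask would be added to the task's training set with their pseudo labels ("add information"), while the rest could still be used in a weaker role ("preserve information"), e.g. with down-weighted losses.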