Paper Title

Generalized Zero-Shot Domain Adaptation via Coupled Conditional Variational Autoencoders

Authors

Qian Wang, Toby P. Breckon

Abstract

Domain adaptation approaches aim to exploit useful information from a source domain, where supervised learning examples are easier to obtain, to address a learning problem in a target domain where such examples are scarce or unavailable. In classification problems, domain adaptation has been studied under varying supervised, unsupervised and semi-supervised conditions. However, a common situation, in which labelled samples are available for only a subset of the target domain classes, has been overlooked. In this paper, we formulate this particular domain adaptation problem within a generalized zero-shot learning framework by treating the labelled source domain samples as semantic representations for zero-shot learning. Neither conventional domain adaptation approaches nor zero-shot learning algorithms apply directly to this problem. To address this generalized zero-shot domain adaptation problem, we present a novel Coupled Conditional Variational Autoencoder (CCVAE) which can generate synthetic target domain features for unseen classes from their source domain counterparts. Extensive experiments have been conducted on three domain adaptation datasets, including a bespoke X-ray security checkpoint dataset that simulates a real-world application in aviation security. The results demonstrate the effectiveness of our proposed approach, both against established benchmarks and in terms of real-world applicability.
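The abstract describes the CCVAE only at a high level. Below is a minimal, illustrative PyTorch sketch of one plausible reading: two class-conditional VAE branches (source and target) trained with within-domain and cross-domain reconstruction losses, so that a source-domain feature of a class unseen in the target domain can be encoded and decoded into a synthetic target-domain feature. All layer sizes, the coupling via cross-reconstruction, and the loss weights here are assumptions for illustration, not the paper's specification.

```python
# Illustrative sketch of a coupled conditional VAE (CCVAE) idea.
# Two class-conditional VAE branches share a latent space; a source-domain
# feature of an unseen class is encoded by the source branch and decoded by
# the target branch to synthesize a target-domain feature.
# Dimensions, architecture and loss weights are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    """One class-conditional VAE branch (encoder + decoder)."""
    def __init__(self, feat_dim, num_classes, latent_dim=64, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim + num_classes, hidden), nn.ReLU())
        self.fc_mu = nn.Linear(hidden, latent_dim)
        self.fc_logvar = nn.Linear(hidden, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + num_classes, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim))

    def encode(self, x, y):
        # x: features, y: one-hot class labels (the condition)
        h = self.encoder(torch.cat([x, y], dim=1))
        return self.fc_mu(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z, y):
        return self.decoder(torch.cat([z, y], dim=1))

class CoupledCVAE(nn.Module):
    """Two CVAE branches coupled through a shared latent space."""
    def __init__(self, feat_dim, num_classes, latent_dim=64):
        super().__init__()
        self.src = CVAE(feat_dim, num_classes, latent_dim)
        self.tgt = CVAE(feat_dim, num_classes, latent_dim)

    def loss(self, x_s, x_t, y):
        # Within-domain reconstruction + KL for each branch
        # (trained on classes seen in both domains).
        mu_s, lv_s = self.src.encode(x_s, y)
        mu_t, lv_t = self.tgt.encode(x_t, y)
        z_s = self.src.reparameterize(mu_s, lv_s)
        z_t = self.tgt.reparameterize(mu_t, lv_t)
        rec = (F.mse_loss(self.src.decode(z_s, y), x_s)
               + F.mse_loss(self.tgt.decode(z_t, y), x_t))
        kl = (-0.5 * torch.mean(torch.sum(1 + lv_s - mu_s.pow(2) - lv_s.exp(), dim=1))
              - 0.5 * torch.mean(torch.sum(1 + lv_t - mu_t.pow(2) - lv_t.exp(), dim=1)))
        # Cross-domain reconstruction couples the two latent spaces:
        # a source encoding must decode to the paired target feature,
        # and vice versa.
        cross = (F.mse_loss(self.tgt.decode(z_s, y), x_t)
                 + F.mse_loss(self.src.decode(z_t, y), x_s))
        return rec + cross + 0.1 * kl  # 0.1 is an illustrative weight

    @torch.no_grad()
    def synthesize_target(self, x_s, y):
        # Generate a synthetic target-domain feature for a class unseen
        # in the target domain, from its labelled source-domain sample.
        mu, logvar = self.src.encode(x_s, y)
        z = self.src.reparameterize(mu, logvar)
        return self.tgt.decode(z, y)
```

In this reading, the cross-reconstruction terms are what enable unseen-class generation: they force the two encoders to map paired seen-class features to interchangeable latent codes, so at test time a source feature of an unseen class can be routed through the target decoder, and the resulting synthetic features can train a classifier over all target classes.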
