Paper Title


Learning to Compose Soft Prompts for Compositional Zero-Shot Learning

Paper Authors

Nayak, Nihal V., Yu, Peilin, Bach, Stephen H.

Paper Abstract

We introduce compositional soft prompting (CSP), a parameter-efficient learning technique to improve the zero-shot compositionality of large-scale pretrained vision-language models (VLMs) like CLIP. We develop CSP for compositional zero-shot learning, the task of predicting unseen attribute-object compositions (e.g., old cat and young tiger). VLMs have a flexible text encoder that can represent arbitrary classes as natural language prompts, but they often underperform task-specific architectures on the compositional zero-shot benchmark datasets. CSP treats the attributes and objects that define classes as learnable tokens of vocabulary. During training, the vocabulary is tuned to recognize classes that compose tokens in multiple ways (e.g., old cat and white cat). At test time, we recompose the learned attribute-object vocabulary in new combinations to recognize novel classes. We show that CSP outperforms CLIP on benchmark datasets by an average of 10.9 percentage points on AUC. CSP also outperforms CoOp, a soft prompting method that fine-tunes the prefix context tokens, by an average of 5.8 percentage points on AUC. We perform additional experiments to show that CSP improves generalization to higher-order attribute-attribute-object compositions (e.g., old white cat) and combinations of pretrained attributes and fine-tuned objects. The code is available at https://github.com/BatsResearch/csp.
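To make the core idea concrete, below is a minimal PyTorch sketch of the learnable attribute-object vocabulary that the abstract describes, assuming a CLIP-style pipeline in which these soft tokens are spliced into a prompt like "a photo of [attribute] [object]" before the frozen text encoder runs. All identifiers here (`CSPVocabulary`, `compose`, the random initialization, and the vocabulary sizes) are illustrative assumptions, not the authors' released implementation; see https://github.com/BatsResearch/csp for that.

```python
# Minimal sketch of compositional soft prompting (CSP), under the
# assumption that each class prompt is "a photo of [attribute] [object]"
# and that the [attribute]/[object] token embeddings are the only
# trainable parameters (the VLM itself stays frozen).
import torch
import torch.nn as nn


class CSPVocabulary(nn.Module):
    """Learnable attribute and object token embeddings, composed into
    prompts for both seen and unseen attribute-object pairs."""

    def __init__(self, num_attrs: int, num_objs: int, embed_dim: int):
        super().__init__()
        # One soft token per attribute and per object. (The paper
        # initializes these from the pretrained embeddings of the
        # attribute and object words; random init here just keeps the
        # sketch self-contained.)
        self.attr_emb = nn.Parameter(0.02 * torch.randn(num_attrs, embed_dim))
        self.obj_emb = nn.Parameter(0.02 * torch.randn(num_objs, embed_dim))

    def compose(self, attr_idx: torch.Tensor, obj_idx: torch.Tensor) -> torch.Tensor:
        """Stack the soft tokens for a batch of (attribute, object) classes.

        Returns a (batch, 2, embed_dim) tensor to splice into the
        prompt's token-embedding sequence before the frozen text encoder.
        """
        return torch.stack([self.attr_emb[attr_idx], self.obj_emb[obj_idx]], dim=1)


# Usage: train the vocabulary on seen pairs, then simply recompose it
# for unseen pairs at test time (sizes below are illustrative).
vocab = CSPVocabulary(num_attrs=115, num_objs=245, embed_dim=512)
seen_pair = vocab.compose(torch.tensor([3]), torch.tensor([7]))      # e.g., old cat
unseen_pair = vocab.compose(torch.tensor([9]), torch.tensor([1]))    # e.g., young tiger
print(seen_pair.shape)  # torch.Size([1, 2, 512])
```

Because the same attribute and object tokens are reused across compositions, recognizing a novel class requires no new parameters, only a new indexing of the learned vocabulary; this is what distinguishes CSP from prefix-tuning methods like CoOp, which learn context tokens shared by all classes.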
