Paper Title

Prompt Tuning with Soft Context Sharing for Vision-Language Models

Paper Authors

Kun Ding, Ying Wang, Pengzhang Liu, Qiang Yu, Haojian Zhang, Shiming Xiang, Chunhong Pan

Paper Abstract

Vision-language models have recently shown great potential on many tasks in computer vision. Meanwhile, prior work demonstrates that prompt tuning designed for vision-language models can achieve superior performance on few-shot image recognition compared to linear probing, a strong baseline. In practice, many few-shot tasks are inherently correlated, particularly within specialized domains. However, such information has previously been overlooked. Inspired by the fact that modeling task relationships via multi-task learning can usually boost performance, we propose a novel method, SoftCPT (Soft Context Sharing for Prompt Tuning), to tune pre-trained vision-language models on multiple target few-shot tasks jointly. Specifically, we design a task-shared meta network to generate the prompt context for each task, using the task name together with a learnable task context as input. The parameters of this meta network, as well as the task context, are tuned on the joint training set of all tasks. As such, the prompt contexts of all tasks are shared in a soft manner. Extensive experiments across four multi-task few-shot datasets covering 44 tasks and 1593 categories demonstrate that SoftCPT significantly outperforms single-task prompt tuning methods, highlighting the effectiveness of multi-task learning for vision-language prompt tuning. Code is available at https://github.com/kding1225/softcpt.
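To make the described architecture concrete, below is a minimal PyTorch sketch of the idea as stated in the abstract, not the authors' implementation: a single task-shared meta network maps a task-name embedding plus a learnable task context to per-task prompt context tokens. All module names, tensor shapes, and the two-layer MLP are illustrative assumptions.

```python
# Minimal sketch (not the official SoftCPT code) of a task-shared meta network
# that generates a prompt context from a task-name embedding and a learnable
# task context. Shapes and the MLP design are assumptions for illustration.
import torch
import torch.nn as nn

class TaskSharedMetaNet(nn.Module):
    def __init__(self, name_dim, ctx_dim, n_ctx, embed_dim):
        super().__init__()
        # Learnable task context, shared across all tasks (hypothetical shape).
        self.task_ctx = nn.Parameter(torch.randn(n_ctx, ctx_dim) * 0.02)
        # One meta network serves every task, so prompt contexts are shared
        # "softly" through its parameters rather than copied verbatim.
        self.meta = nn.Sequential(
            nn.Linear(name_dim + ctx_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, embed_dim),
        )

    def forward(self, task_name_emb):
        # task_name_emb: (name_dim,) embedding of the task name, e.g. from a
        # frozen text encoder.
        name = task_name_emb.unsqueeze(0).expand(self.task_ctx.size(0), -1)
        # Concatenate the task-name embedding with the learnable task context,
        # then generate per-task prompt context tokens: (n_ctx, embed_dim).
        x = torch.cat([name, self.task_ctx], dim=-1)
        return self.meta(x)

# Hypothetical usage: a 512-d task-name embedding, 16 context tokens.
net = TaskSharedMetaNet(name_dim=512, ctx_dim=512, n_ctx=16, embed_dim=512)
prompt_ctx = net(torch.randn(512))  # (16, 512) prompt context for one task
```

In joint training, the meta network's weights and the task context would be updated on the pooled training data of all tasks, while the generated prompt context differs per task through the task-name input.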
