Paper Title
Latent Group Structured Multi-task Learning
Paper Authors
Paper Abstract
In multi-task learning (MTL), we improve the performance of key machine learning algorithms by training various tasks jointly. When the number of tasks is large, modeling task structure can further refine the task relationship model. For example, tasks can often be grouped based on metadata or via simple preprocessing steps such as K-means. In this paper, we present our group structured latent-space multi-task learning model, which encourages tasks to follow the group structure defined by prior information. We use an alternating minimization method to learn the model parameters. Experiments are conducted on both synthetic and real-world datasets, showing competitive performance compared with single-task learning (where each group is trained separately) and other MTL baselines.
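To make the setup described above concrete, the following is a minimal sketch, in Python/NumPy, of one way a group structured latent-space MTL model could be fit by alternating minimization. It is an illustration under assumed modeling choices, not the paper's exact formulation: each task's weight vector is factored as w_t = L s_t with a shared latent basis L and per-task codes s_t, the codes of tasks in the same prior-defined group are pulled toward their group mean, and gradient updates alternate between the codes and the basis. The function name fit_group_latent_mtl and the hyper-parameters k, lam, mu, lr are illustrative.

```python
import numpy as np

def fit_group_latent_mtl(Xs, ys, groups, k=4, lam=0.1, mu=1.0,
                         iters=50, lr=1e-2, seed=0):
    """Illustrative sketch, not the paper's exact algorithm.
    Xs, ys: lists of per-task design matrices and targets;
    groups: group id per task (prior information, e.g. metadata or K-means);
    k: latent dimension; lam: l2 penalty; mu: group-coupling strength."""
    rng = np.random.default_rng(seed)
    T, d = len(Xs), Xs[0].shape[1]
    L = rng.normal(scale=0.1, size=(d, k))   # shared latent basis
    S = rng.normal(scale=0.1, size=(k, T))   # per-task latent codes

    for _ in range(iters):
        # Step 1: update per-task codes S with the basis L fixed.
        for t in range(T):
            X, y = Xs[t], ys[t]
            same = [j for j in range(T) if groups[j] == groups[t] and j != t]
            group_mean = S[:, same].mean(axis=1) if same else S[:, t]
            resid = X @ (L @ S[:, t]) - y
            grad = (L.T @ (X.T @ resid)) / len(y) \
                   + lam * S[:, t] + mu * (S[:, t] - group_mean)
            S[:, t] -= lr * grad
        # Step 2: update the shared basis L with the codes S fixed.
        gL = lam * L
        for t in range(T):
            X, y = Xs[t], ys[t]
            resid = X @ (L @ S[:, t]) - y
            gL += np.outer(X.T @ resid, S[:, t]) / len(y)
        L -= lr * gL
    return L, S
```

Alternating between the two blocks of variables keeps each sub-problem simple (a regularized least-squares-style update), which is the usual motivation for alternating minimization in latent-factor MTL models of this kind.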