Paper Title

Pareto Manifold Learning: Tackling multiple tasks via ensembles of single-task models

Authors

Dimitriadis, Nikolaos, Frossard, Pascal, Fleuret, François

Abstract

In Multi-Task Learning (MTL), tasks may compete and limit the performance achieved on each other, rather than guiding the optimization to a solution superior to all its single-task trained counterparts. Since there is often not a unique solution optimal for all tasks, practitioners have to balance tradeoffs between tasks' performance, and resort to optimality in the Pareto sense. Most MTL methodologies either completely neglect this aspect, and instead of aiming at learning a Pareto Front, produce one solution predefined by their optimization schemes, or produce diverse but discrete solutions. Recent approaches parameterize the Pareto Front via neural networks, leading to complex mappings from tradeoff to objective space. In this paper, we conjecture that the Pareto Front admits a linear parameterization in parameter space, which leads us to propose \textit{Pareto Manifold Learning}, an ensembling method in weight space. Our approach produces a continuous Pareto Front in a single training run, and allows the performance on each task to be modulated during inference. Experiments on multi-task learning benchmarks, ranging from image classification to tabular datasets and scene understanding, show that \textit{Pareto Manifold Learning} outperforms state-of-the-art single-point algorithms, while learning a better Pareto parameterization than multi-point baselines.
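The core idea of the abstract, a linear parameterization of the Pareto Front in weight space that can be traversed at inference time, can be illustrated on a toy problem. The sketch below is a hedged illustration of that conjecture, not the paper's training algorithm: it interpolates between two hypothetical single-task optima (`w_task1`, `w_task2`) for convex quadratic task losses, where the straight segment between the optima happens to be exactly the Pareto set.

```python
import numpy as np

# Toy setup (assumed, not from the paper): two quadratic task losses
# whose minimizers are distinct points in a 2-D weight space.
w_task1 = np.array([1.0, 0.0])  # minimizer of task 1's loss
w_task2 = np.array([0.0, 1.0])  # minimizer of task 2's loss

def loss1(w):
    return float(np.sum((w - w_task1) ** 2))

def loss2(w):
    return float(np.sum((w - w_task2) ** 2))

def interpolate(alpha):
    """Weight-space ensemble: convex combination of single-task weights.

    alpha is the tradeoff knob chosen at inference time; alpha=0 recovers
    the task-1 specialist, alpha=1 the task-2 specialist.
    """
    return (1.0 - alpha) * w_task1 + alpha * w_task2

# Sweeping alpha traces a continuous front of tradeoffs: task-1 loss
# increases while task-2 loss decreases along the linear path.
front = [(loss1(interpolate(a)), loss2(interpolate(a)))
         for a in np.linspace(0.0, 1.0, 5)]
for l1, l2 in front:
    print(f"loss1={l1:.3f}  loss2={l2:.3f}")
```

In this convex toy case the linear path is Pareto-optimal by construction; the paper's conjecture is that a comparable linear structure is a good enough approximation in neural-network weight space to be learned in a single training run.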
