Paper Title

Auxiliary Learning by Implicit Differentiation

Authors

Aviv Navon, Idan Achituve, Haggai Maron, Gal Chechik, Ethan Fetaya

Abstract

Training neural networks with auxiliary tasks is a common practice for improving the performance on a main task of interest. Two main challenges arise in this multi-task learning setting: (i) designing useful auxiliary tasks; and (ii) combining auxiliary tasks into a single coherent loss. Here, we propose a novel framework, AuxiLearn, that targets both challenges based on implicit differentiation. First, when useful auxiliaries are known, we propose learning a network that combines all losses into a single coherent objective function. This network can learn non-linear interactions between tasks. Second, when no useful auxiliary task is known, we describe how to learn a network that generates a meaningful, novel auxiliary task. We evaluate AuxiLearn in a series of tasks and domains, including image segmentation and learning with attributes in the low data regime, and find that it consistently outperforms competing methods.
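The core mechanism named in the abstract, implicit differentiation, computes how the main-task loss changes with the auxiliary parameters by differentiating through the optimum of the training problem. The following is a minimal one-dimensional sketch of that idea on a made-up quadratic bilevel problem; the variables `a`, `t`, and `phi` are hypothetical stand-ins, not the paper's actual model.

```python
# Toy bilevel problem illustrating implicit differentiation (a hypothetical
# quadratic instance; phi plays the role of an auxiliary-combination parameter).
#
# Inner problem:  w*(phi) = argmin_w  f(w, phi) = 0.5 * a * w**2 - phi * w
# Outer loss:     L(phi)  = 0.5 * (w*(phi) - t)**2    (main-task loss at w*)
#
# The implicit function theorem gives
#   dw*/dphi = -(d2f/dw2)^-1 * (d2f/(dw dphi)) = -(1/a) * (-1) = 1/a,
# so the hypergradient is  dL/dphi = (w*(phi) - t) / a.

a, t = 2.0, 0.7  # inner-problem curvature and main-task target (made up)

def inner_solution(phi):
    # Inner stationarity: a * w - phi = 0  =>  w* = phi / a
    return phi / a

def outer_loss(phi):
    w = inner_solution(phi)
    return 0.5 * (w - t) ** 2

def implicit_hypergrad(phi):
    # Hypergradient via the implicit function theorem:
    # dL/dphi = -(d2f/(dw dphi)) / (d2f/dw2) * dL/dw, evaluated at w*(phi).
    w = inner_solution(phi)
    grad_w = w - t                 # dL/dw at the inner optimum
    hessian, cross = a, -1.0       # d2f/dw2 and d2f/(dw dphi)
    return -cross / hessian * grad_w

# Sanity check: the implicit gradient matches central finite differences.
phi0, eps = 0.3, 1e-6
fd = (outer_loss(phi0 + eps) - outer_loss(phi0 - eps)) / (2 * eps)
print(abs(implicit_hypergrad(phi0) - fd))  # ~0
```

In AuxiLearn the inner problem has no closed form, so the same hypergradient is obtained by approximately inverting the inner Hessian rather than solving it analytically; this toy keeps the inner problem quadratic so the implicit gradient can be verified exactly.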
