Paper Title

Warm-starting DARTS using meta-learning

Paper Authors

Matej Grobelnik, Joaquin Vanschoren

Paper Abstract

Neural architecture search (NAS) has shown great promise in the field of automated machine learning (AutoML). NAS has outperformed hand-designed networks and made a significant step forward in automating the design of deep neural networks, further reducing the need for human expertise. However, most research targets a single specific task, leaving NAS methods that operate over multiple tasks mostly overlooked. Generally, there are two popular ways to find an architecture for a novel task: searching from scratch, which is inefficient by design, or transferring architectures discovered on other tasks, which provides no performance guarantees and is likely suboptimal. In this work, we present a meta-learning framework to warm-start Differentiable Architecture Search (DARTS). DARTS is a NAS method that can be initialized with a transferred architecture and is able to quickly adapt to new tasks. A task similarity measure is used to determine which transfer architecture is selected, as transfer architectures found on similar tasks will likely perform better. Additionally, we employ a simple meta-transfer architecture learned over multiple tasks. Experiments show that warm-started DARTS is able to find competitively performing architectures while reducing search costs on average by 60%.
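
To make the warm-starting idea in the abstract concrete, the sketch below shows the core mechanism in PyTorch-style code: store the architecture parameters (alphas) found on previous tasks, pick the stored set whose source task is most similar to the new task, and use it to initialize a fresh DARTS search instead of the usual uniform initialization. This is a minimal illustrative sketch, not the authors' implementation: `architecture_store`, the L2 distance over task meta-feature vectors, and the helper names are assumptions, and `arch_parameters()` merely follows the naming convention of the public DARTS codebase.

```python
import torch

def select_transfer_architecture(new_task_features, architecture_store):
    """Pick the stored alphas whose source task is most similar to the
    new task. Here similarity is the L2 distance between task
    meta-feature vectors, a stand-in for the paper's similarity measure.

    architecture_store: list of (task_features, alphas) pairs collected
    from DARTS runs on previous tasks (hypothetical structure).
    """
    best_alphas, best_dist = None, float("inf")
    for task_features, alphas in architecture_store:
        dist = torch.dist(new_task_features, task_features).item()
        if dist < best_dist:
            best_dist, best_alphas = dist, alphas
    return best_alphas

def warm_start_darts(model, transfer_alphas):
    """Initialize the search model's architecture parameters with the
    transferred alphas instead of uniform values; the DARTS bilevel
    search then proceeds as usual, but from a better starting point,
    which is where the reported search-cost reduction comes from."""
    with torch.no_grad():
        for alpha, source in zip(model.arch_parameters(), transfer_alphas):
            alpha.copy_(source)
```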
