Paper Title
SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via Train-Free Metrics
Paper Authors
Paper Abstract
Neural Architecture Search (NAS) algorithms are intended to remove the burden of manual neural network design, and have been shown capable of designing excellent models for a variety of well-known problems. However, these algorithms require a variety of design parameters in the form of user configuration or hard-coded decisions, which limit the variety of networks that can be discovered. This means that NAS algorithms do not eliminate model design tuning; they merely shift where that tuning must be applied. In this paper, we present SpiderNet, a hybrid differentiable-evolutionary and hardware-aware algorithm that rapidly and efficiently produces state-of-the-art networks. More importantly, SpiderNet is a proof of concept of a minimally configured NAS algorithm; the majority of design choices seen in other algorithms are incorporated into SpiderNet's dynamically evolving search space, minimizing the number of user choices to just two: reduction cell count and initial channel count. SpiderNet produces models highly competitive with the state of the art, and outperforms random search in accuracy, runtime, memory size, and parameter count.
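To make the two-parameter claim concrete, below is a minimal Python sketch of what such a configuration surface could look like. The names (SpiderNetConfig, search) and the interface are hypothetical illustrations, not the authors' published API; only the two fields mirror the choices the abstract names, and everything else is assumed to be discovered by the search itself.

from dataclasses import dataclass

@dataclass
class SpiderNetConfig:
    # The only two user-facing choices the abstract describes:
    reduction_cell_count: int = 2  # number of spatial-downsampling cells in the macro-skeleton
    initial_channels: int = 32     # channel width of the stem; later widths follow from this

def search(config: SpiderNetConfig):
    """Placeholder for the hybrid differentiable-evolutionary search loop.

    In the paper's framing, every design decision beyond the two fields above
    (operation set, cell topology, network depth, and so on) is folded into the
    dynamically evolving search space, guided by train-free metrics and
    hardware-aware constraints. This body is illustrative only.
    """
    raise NotImplementedError

if __name__ == "__main__":
    cfg = SpiderNetConfig(reduction_cell_count=2, initial_channels=32)
    print(cfg)  # the entire user-supplied configuration, per the abstract's claim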