Paper Title

Learn to Bind and Grow Neural Structures

Authors

Azhar Shaikh, Nishant Sinha

Abstract

Task-incremental learning involves the challenging problem of learning new tasks continually, without forgetting past knowledge. Many approaches address the problem by expanding the structure of a shared neural network as tasks arrive, but struggle to grow optimally, without losing past knowledge. We present a new framework, Learn to Bind and Grow, which learns a neural architecture for a new task incrementally, either by binding with layers of a similar task or by expanding layers which are more likely to conflict between tasks. Central to our approach is a novel, interpretable, parameterization of the shared, multi-task architecture space, which then enables computing globally optimal architectures using Bayesian optimization. Experiments on continual learning benchmarks show that our framework performs comparably with earlier expansion based approaches and is able to flexibly compute multiple optimal solutions with performance-size trade-offs.
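The bind-or-grow idea described above can be illustrated with a toy sketch: each layer of the shared network is parameterized by a binary choice (bind, i.e. reuse a shared layer, vs. grow, i.e. add a task-specific layer), and a search procedure picks the configuration that maximizes a performance proxy under a size budget. The paper uses Bayesian optimization over this space; the sketch below substitutes exhaustive search for simplicity, and the functions, scores, and layer count are all hypothetical placeholders, not the paper's actual implementation.

```python
from itertools import product

NUM_LAYERS = 4  # hypothetical depth of the shared backbone

def architecture_size(config):
    # Each "grow" decision (1) adds a task-specific layer on top of the
    # NUM_LAYERS shared layers; each "bind" decision (0) reuses a shared layer.
    return NUM_LAYERS + sum(config)

def toy_accuracy(config):
    # Hypothetical performance proxy: growing deeper (more task-conflict-prone)
    # layers is assumed to help more, mimicking the paper's intuition that
    # conflicting layers should be expanded.
    return 0.7 + 0.05 * sum(i * c for i, c in enumerate(config)) / NUM_LAYERS

def search(size_budget):
    # Exhaustive search over the 2^NUM_LAYERS bind/grow configurations,
    # a stand-in for the Bayesian optimization used in the paper.
    best = None
    for config in product([0, 1], repeat=NUM_LAYERS):
        if architecture_size(config) > size_budget:
            continue  # skip architectures that exceed the size budget
        acc = toy_accuracy(config)
        if best is None or acc > best[0]:
            best = (acc, config)
    return best

# With a budget allowing two extra layers, the search grows the two
# deepest (most conflict-prone) layers and binds the rest.
print(search(NUM_LAYERS + 2))
```

Sweeping `size_budget` from `NUM_LAYERS` (bind everything) to `2 * NUM_LAYERS` (grow everything) recovers the performance-size trade-off curve the abstract mentions, with one optimal configuration per budget.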
