Paper Title

Eliminating Meta Optimization Through Self-Referential Meta Learning

Authors

Louis Kirsch, Jürgen Schmidhuber

Abstract

Meta Learning automates the search for learning algorithms. At the same time, it creates a dependency on human engineering on the meta-level, where meta learning algorithms need to be designed. In this paper, we investigate self-referential meta learning systems that modify themselves without the need for explicit meta optimization. We discuss the relationship of such systems to in-context and memory-based meta learning and show that self-referential neural networks require functionality to be reused in the form of parameter sharing. Finally, we propose fitness monotonic execution (FME), a simple approach to avoid explicit meta optimization. A neural network self-modifies to solve bandit and classic control tasks, improves its self-modifications, and learns how to learn, purely by assigning more computational resources to better performing solutions.
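The abstract describes fitness monotonic execution (FME) as avoiding explicit meta optimization by simply giving more computational resources to better-performing self-modifying solutions. A toy sketch of that idea follows. Everything here is a hypothetical stand-in, not the paper's implementation: the "solution" is a single scalar, `fitness` is a toy objective in place of a bandit or control task, and `self_modify` is a random perturbation in place of a learned self-modification.

```python
import random

def fitness(params):
    # Toy objective: closeness to a hidden target (hypothetical stand-in
    # for reward on a bandit or classic control task).
    target = 0.7
    return -abs(params - target)

def self_modify(params):
    # The solution proposes a change to itself; a random perturbation
    # stands in for a learned self-modification here.
    return params + random.gauss(0, 0.1)

def fme(steps=200, seed=0):
    """Sketch of the fitness-monotonic idea: keep executing (i.e. spending
    compute on) the best-performing solution, and keep a self-modification
    only if fitness does not decrease. No explicit meta-optimizer or
    meta-gradient is involved."""
    random.seed(seed)
    current = random.random()
    best_fit = fitness(current)
    for _ in range(steps):
        candidate = self_modify(current)
        cand_fit = fitness(candidate)
        if cand_fit >= best_fit:  # monotonic: only non-worsening changes survive
            current, best_fit = candidate, cand_fit
    return current, best_fit

params, fit = fme()
```

The only selection pressure is the monotonicity rule itself: improvements accumulate because worse self-modifications are never executed further, which is the sense in which learning (and learning to learn) happens without a designed meta-learning algorithm.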
