Paper Title
Progressive Latent Replay for efficient Generative Rehearsal
Paper Authors
Paper Abstract
We introduce a new method for internal replay that modulates the frequency of rehearsal based on the depth of the network. While replay strategies mitigate the effects of catastrophic forgetting in neural networks, recent works on generative replay show that performing rehearsal only on the deeper layers of the network improves continual learning performance. However, the generative approach introduces additional computational overhead, limiting its applicability. Motivated by the observation that earlier layers of neural networks forget less abruptly, we propose to update network layers with varying frequency, using intermediate-level features during replay. This reduces the computational burden by omitting computations for both the deeper layers of the generator and the earlier layers of the main model. We name our method Progressive Latent Replay and show that it outperforms Internal Replay while using significantly fewer resources.
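To make the depth-dependent rehearsal schedule concrete, below is a minimal PyTorch sketch of the idea as described in the abstract. It is not the authors' implementation: the block decomposition, the `replay_periods` schedule, and the `generate_latents(depth=...)` generator interface are all illustrative assumptions. The sketch only shows the core mechanism, namely that earlier blocks are rehearsed less often, and that when rehearsal starts at an intermediate depth, both the generator's deeper layers and the main model's earlier blocks are skipped.

```python
import torch
import torch.nn as nn

class ProgressiveLatentReplay:
    """Hypothetical sketch of depth-modulated (progressive) latent replay."""

    def __init__(self, blocks, replay_periods):
        # blocks[i] is the i-th stage of the main model (earlier -> deeper).
        # replay_periods[i]: rehearse block i every replay_periods[i] steps.
        # Earlier blocks get larger periods because they forget less abruptly.
        assert len(blocks) == len(replay_periods)
        self.blocks = blocks
        self.replay_periods = replay_periods

    def replay_step(self, step, generate_latents, loss_fn, optimizer):
        # Find all blocks scheduled for rehearsal at this step.
        active = [i for i, p in enumerate(self.replay_periods) if step % p == 0]
        if not active:
            return
        start = min(active)
        # The generator emits features at the depth matching `start`, so the
        # main model's earlier blocks (and the generator layers that would
        # reconstruct shallower features) are never evaluated.
        latents, targets = generate_latents(depth=start)
        out = latents
        for block in self.blocks[start:]:
            out = block(out)
        loss = loss_fn(out, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Example wiring with toy shapes: the deepest block is rehearsed every step,
# the earliest only every 4 steps.
blocks = [nn.Linear(32, 32), nn.Linear(32, 32), nn.Linear(32, 10)]
plr = ProgressiveLatentReplay(blocks, replay_periods=[4, 2, 1])
params = [p for b in blocks for p in b.parameters()]
opt = torch.optim.SGD(params, lr=0.01)

def fake_generator(depth):
    # Stand-in for a trained generator of intermediate features; here random
    # tensors of the right shape, ignoring `depth` for simplicity.
    return torch.randn(8, 32), torch.randint(0, 10, (8,))

for step in range(8):
    plr.replay_step(step, fake_generator, nn.CrossEntropyLoss(), opt)
```

The compute saving in this sketch comes from `self.blocks[start:]`: on most steps `start > 0`, so only the deeper portion of the model is run and updated during replay, mirroring the paper's claim of fewer resources than full Internal Replay.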