Paper Title

Budget-Aware Pruning for Multi-Domain Learning

Paper Authors

Samuel Felipe dos Santos, Rodrigo Berriel, Thiago Oliveira-Santos, Nicu Sebe, Jurandy Almeida

Paper Abstract

Deep learning has achieved state-of-the-art performance on several computer vision tasks and domains. Nevertheless, it still has a high computational cost and demands a large number of parameters. Such requirements hinder its use in resource-limited environments and demand both software and hardware optimization. Another limitation is that deep models are usually specialized for a single domain or task, requiring them to learn and store new parameters for each new one. Multi-Domain Learning (MDL) attempts to solve this problem by learning a single model that is capable of performing well in multiple domains. Nevertheless, these models are usually larger than the baseline for a single domain. This work tackles both of these problems: our objective is to prune models capable of handling multiple domains according to a user-defined budget, making them more computationally affordable while keeping similar classification performance. We achieve this by encouraging all domains to use a similar subset of filters from the baseline model, up to the amount defined by the user's budget. Then, filters that are not used by any domain are pruned from the network. The proposed approach innovates by better adapting to resource-limited devices and, to our knowledge, is the only work capable of handling multiple domains at test time with fewer parameters and lower computational complexity than the baseline model for a single domain.
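To make the filter-sharing idea from the abstract concrete, below is a minimal Python/NumPy sketch of how a user budget could bound a shared set of filters across domains. The function name shared_filter_mask, the per-domain importance scores, and the sum-based ranking are illustrative assumptions, not the paper's actual training procedure, which learns which baseline filters each domain uses and prunes those selected by no domain.

```python
import numpy as np

def shared_filter_mask(domain_scores, budget):
    """Toy illustration of budget-aware filter sharing.

    domain_scores: (num_domains, num_filters) array of hypothetical
                   per-domain filter importance scores.
    budget:        fraction of the baseline's filters the pruned model may keep.
    Returns a boolean mask over filters; filters outside the mask are used by
    no domain and could be removed from the layer.
    """
    num_filters = domain_scores.shape[1]
    keep = max(1, int(round(budget * num_filters)))
    # Encourage overlap: rank filters by importance summed over domains,
    # pushing all domains toward the same subset of the baseline.
    shared_importance = domain_scores.sum(axis=0)
    kept = np.argsort(shared_importance)[-keep:]
    mask = np.zeros(num_filters, dtype=bool)
    mask[kept] = True
    return mask

# Toy usage: 3 domains, a layer with 8 filters, 50% budget.
rng = np.random.default_rng(0)
scores = rng.random((3, 8))
print(shared_filter_mask(scores, budget=0.5))
```

In practice, the filters left outside such a mask would be physically removed from each convolutional layer, which is what reduces both the parameter count and the computational cost below the single-domain baseline.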
