Paper Title


Stability of the scattering transform for deformations with minimal regularity

Paper Authors

Fabio Nicola, S. Ivan Trapasso

Paper Abstract


Within the mathematical analysis of deep convolutional neural networks, the wavelet scattering transform introduced by Stéphane Mallat is a unique example of how the ideas of multiscale analysis can be combined with a cascade of modulus nonlinearities to build a nonexpansive, translation invariant signal representation with provable geometric stability properties, namely Lipschitz continuity with respect to the action of small $C^2$ diffeomorphisms - a remarkable result for both theoretical and practical purposes, inherently depending on the choice of the filters and their arrangement into a hierarchical architecture. In this note, we further investigate the intimate relationship between the scattering structure and the regularity of the deformation in the Hölder regularity scale $C^\alpha$, $\alpha>0$. We are able to precisely identify the stability threshold, proving that stability is still achievable for deformations of class $C^\alpha$, $\alpha>1$, whereas instability phenomena can occur at lower regularity levels modelled by $C^\alpha$, $0\le \alpha<1$. While the behaviour at the threshold given by Lipschitz (or even $C^1$) regularity remains beyond reach, we are able to prove a stability bound in that case, up to $\varepsilon$ losses.
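For context, here is a minimal sketch in LaTeX of the objects the abstract refers to, written in the standard notation of Mallat's framework; the symbols $S_J[p]$, $\psi_\lambda$, $\phi_{2^J}$ and $L_\tau$, as well as the simplified form of the bound, are assumptions taken from that general setting rather than from this paper.

\[
S_J[p]f \;=\; \Bigl|\,\cdots\,\bigl|\,|f \ast \psi_{\lambda_1}| \ast \psi_{\lambda_2}\,\bigr| \cdots \ast \psi_{\lambda_m}\,\Bigr| \ast \phi_{2^J},
\qquad p = (\lambda_1,\dots,\lambda_m),
\]
where the $\psi_\lambda$ are dilated band-pass wavelets and $\phi_{2^J}$ is a low-pass filter at scale $2^J$. A deformation $\tau$ acts on a signal by
\[
L_\tau f(x) \;=\; f\bigl(x - \tau(x)\bigr),
\]
and Mallat's classical stability result for $\tau \in C^2$ with $\|\nabla\tau\|_\infty \le 1/2$ yields a bound of the schematic form
\[
\bigl\| S_J L_\tau f - S_J f \bigr\| \;\lesssim\; \bigl( 2^{-J}\|\tau\|_\infty + \|\nabla\tau\|_\infty + \|H\tau\|_\infty \bigr)\,\|f\|,
\]
i.e. Lipschitz stability to small $C^2$ diffeomorphisms. The note above asks how much of this estimate survives when $\tau$ belongs only to a Hölder class $C^\alpha$.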
