Paper Title
Universal Equivariant Multilayer Perceptrons
Paper Authors
Paper Abstract
Group invariant and equivariant Multilayer Perceptrons (MLPs), also known as Equivariant Networks, have achieved remarkable success in learning on a variety of data structures, such as sequences, images, sets, and graphs. Using tools from group theory, this paper proves the universality of a broad class of equivariant MLPs with a single hidden layer. In particular, it is shown that a hidden layer on which the group acts regularly is sufficient for universal equivariance (invariance). A corollary is the unconditional universality of equivariant MLPs for Abelian groups, such as CNNs with a single hidden layer. A second corollary is the universality of equivariant MLPs with a high-order hidden layer, where we give both group-agnostic bounds and means for calculating group-specific bounds on the order of the hidden layer that guarantees universal equivariance (invariance).
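To make the central claim concrete, here is a minimal sketch (not code from the paper) of the Abelian-group corollary: for the cyclic group C_n acting on R^n by circular shifts, the regular representation on the hidden layer forces the equivariant linear maps to be circular convolutions, i.e., the network is a single-hidden-layer CNN. The function and variable names below are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def equivariant_linear(x, w):
    """The general C_n-equivariant linear map on the regular
    representation: a circular convolution with weight vector w."""
    n = len(x)
    return np.array([sum(w[j] * x[(i - j) % n] for j in range(n))
                     for i in range(n)])

def equivariant_mlp(x, w1, w2):
    """Single hidden layer on which C_n acts regularly, with a
    pointwise ReLU (which commutes with the shift action)."""
    h = np.maximum(equivariant_linear(x, w1), 0.0)
    return equivariant_linear(h, w2)

rng = np.random.default_rng(0)
n = 6
x = rng.standard_normal(n)
w1, w2 = rng.standard_normal(n), rng.standard_normal(n)

# Equivariance check: cyclically shifting the input shifts the
# output by the same amount.
assert np.allclose(
    equivariant_mlp(np.roll(x, 1), w1, w2),
    np.roll(equivariant_mlp(x, w1, w2), 1),
)
```

The check passes because a circular convolution commutes with circular shifts and the ReLU acts coordinatewise, so it commutes with any permutation of coordinates.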