Paper Title

On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks

Paper Authors

J. Elisenda Grigsby and Kathryn Lindsey

Paper Abstract

Let F: R^n -> R be a feedforward ReLU neural network. It is well known that for any choice of parameters, F is continuous and piecewise (affine) linear. We lay some foundations for a systematic investigation of how the architecture of F impacts the geometry and topology of its possible decision regions for binary classification tasks. Following the classical progression for smooth functions in differential topology, we first define the notion of a generic, transversal ReLU neural network and show that almost all ReLU networks are generic and transversal. We then define a partially-oriented linear 1-complex in the domain of F and identify properties of this complex that yield an obstruction to the existence of bounded connected components of a decision region. We use this obstruction to prove that a decision region of a generic, transversal ReLU network F: R^n -> R with a single hidden layer of dimension (n + 1) can have no more than one bounded connected component.
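
To make the setting concrete, here is a minimal numpy sketch (an illustration, not code from the paper) of the architecture in the main theorem: a feedforward ReLU network F: R^n -> R with a single hidden layer of dimension n + 1, instantiated here with n = 2. All parameter values below are hypothetical placeholders.

import numpy as np

# Hypothetical parameters: input dimension n = 2, one hidden layer of
# dimension n + 1 = 3, scalar output, as in the paper's main theorem.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))  # hidden-layer weights
b1 = rng.standard_normal(3)       # hidden-layer biases
w2 = rng.standard_normal(3)       # output-layer weights
b2 = rng.standard_normal()        # output-layer bias

def F(x):
    # Continuous and piecewise affine for every choice of parameters.
    # Each hidden unit's activation boundary {x : W1[i] @ x + b1[i] = 0}
    # is a hyperplane; in deeper networks these boundaries bend as they
    # pass through later layers, giving "bent hyperplane arrangements".
    return w2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# For binary classification, the two decision regions are
# {x : F(x) > 0} and {x : F(x) < 0}, separated by the level set F^{-1}(0).
print(F(np.array([0.5, -1.0])) > 0)

The theorem quoted above then constrains the topology of these regions: for generic, transversal parameters of this single-hidden-layer architecture, a decision region has at most one bounded connected component.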
