Title
A global universality of two-layer neural networks with ReLU activations
Authors
Abstract
In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in a function space. Many existing works handle convergence over compact sets. In the present paper, we consider global convergence by introducing a suitable norm, so that our results are uniform over any compact set.
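For concreteness, the following is a minimal numpy sketch (not from the paper) of the kind of two-layer ReLU network whose density the abstract refers to; the exact representation of |x| by two hidden units is a standard illustration of the expressive class.

```python
import numpy as np

def two_layer_relu(x, w, b, a):
    """Evaluate a scalar-input two-layer ReLU network:
    f(x) = sum_i a_i * max(w_i * x + b_i, 0).
    """
    hidden = np.maximum(np.outer(x, w) + b, 0.0)  # shape (n_points, n_units)
    return hidden @ a

# Example: |x| is represented exactly by two ReLU units,
# since |x| = ReLU(x) + ReLU(-x).
w = np.array([1.0, -1.0])
b = np.array([0.0, 0.0])
a = np.array([1.0, 1.0])

x = np.linspace(-2.0, 2.0, 9)
print(np.allclose(two_layer_relu(x, w, b, a), np.abs(x)))  # True
```

Universality results of the type summarized above assert that, as the number of hidden units grows, such networks approximate any target function in the chosen function space; the contribution here is a norm under which this approximation is global rather than merely uniform on each fixed compact set.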