Paper Title


A Statistical-Modelling Approach to Feedforward Neural Network Model Selection

Authors

Andrew McInerney, Kevin Burke

Abstract


Feedforward neural networks (FNNs) can be viewed as non-linear regression models, where covariates enter the model through a combination of weighted summations and non-linear functions. Although these models have some similarities to the approaches used within statistical modelling, the majority of neural network research has been conducted outside of the field of statistics. This has resulted in a lack of statistically-based methodology, and, in particular, there has been little emphasis on model parsimony. Determining the input layer structure is analogous to variable selection, while the structure for the hidden layer relates to model complexity. In practice, neural network model selection is often carried out by comparing models using out-of-sample performance. However, in contrast, the construction of an associated likelihood function opens the door to information-criteria-based variable and architecture selection. A novel model selection method, which performs both input- and hidden-node selection, is proposed using the Bayesian information criterion (BIC) for FNNs. The choice of BIC over out-of-sample performance as the model selection objective function leads to an increased probability of recovering the true model, while parsimoniously achieving favourable out-of-sample performance. Simulation studies are used to evaluate and justify the proposed method, and applications on real data are investigated.
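The abstract's central idea, using a likelihood-based information criterion rather than out-of-sample performance to choose an FNN architecture, can be illustrated with a small sketch. The code below is not the authors' implementation; it assumes a Gaussian error model (so the log-likelihood is computed from squared residuals with the error variance profiled out), fits one-hidden-layer tanh networks of varying width by plain gradient descent, and compares them via BIC = k·ln(n) − 2·ln L̂. All function names and training settings here are illustrative choices.

```python
import numpy as np

def gaussian_bic(residuals, n_params):
    """BIC under a Gaussian error model, with sigma^2 profiled out
    as the mean squared residual; +1 parameter counts sigma^2."""
    n = residuals.size
    sigma2 = np.mean(residuals ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return (n_params + 1) * np.log(n) - 2 * loglik

def fit_fnn(X, y, n_hidden, seed=0, lr=0.05, epochs=2000):
    """Fit a one-hidden-layer FNN (tanh activation) by full-batch
    gradient descent on squared error; return residuals and the
    number of estimated weights/biases."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    W1 = rng.normal(scale=0.5, size=(p, n_hidden))
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)            # hidden activations
        err = (H @ w2 + b2) - y
        # backpropagated gradients for the squared-error loss
        g_w2 = H.T @ err / n
        g_b2 = err.mean()
        dH = np.outer(err, w2) * (1 - H ** 2)
        g_W1 = X.T @ dH / n
        g_b1 = dH.mean(axis=0)
        W1 -= lr * g_W1; b1 -= lr * g_b1
        w2 -= lr * g_w2; b2 -= lr * g_b2
    resid = y - (np.tanh(X @ W1 + b1) @ w2 + b2)
    k = W1.size + b1.size + w2.size + 1     # all weights and biases
    return resid, k

# Toy example: the true regression function uses a single hidden node,
# so BIC's penalty k*ln(n) should disfavour needlessly wide networks.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = np.tanh(X @ np.array([1.5, -1.0])) + 0.1 * rng.normal(size=400)
bics = {h: gaussian_bic(*fit_fnn(X, y, h)) for h in (1, 2, 4)}
best = min(bics, key=bics.get)
```

Because BIC penalises every extra hidden node by its full parameter cost (here p + 2 weights per node, scaled by ln n), the selected width tends toward the most parsimonious network whose fit is adequate, which is the behaviour the abstract contrasts with selection by out-of-sample performance alone.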
