Paper Title
Does a larger neural network mean greater information transmission efficiency?
Paper Authors
Paper Abstract
Realistic modeling of the brain involves large numbers of neurons. An important question is how this size affects transmission efficiency. Here, this issue is studied in terms of Shannon's information theory. The Mutual Information between input and output signals is analyzed, theoretically and numerically, for a simple class of networks with an increasing number of neurons. The Levy-Baxter neural model is applied. It turns out that for these networks the Mutual Information converges, with increasing size, asymptotically and very slowly to a saturation level. This suggests that, beyond a certain level, increasing the number of neurons does not significantly improve transmission efficiency; rather, it contributes to reliability.
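The saturation effect described in the abstract can be illustrated with a toy calculation. The sketch below is not the paper's Levy-Baxter model (which involves quantal synaptic failures and an amplitude threshold); it is a simplified stand-in in which a binary stimulus X drives n independent binary units, each firing with a different probability depending on X, and the output is the population spike count Y. The firing probabilities `p0` and `p1` are arbitrary assumed values. Computing the exact mutual information I(X;Y) for growing n shows it rising toward its ceiling H(X) = 1 bit and saturating slowly:

```python
from math import comb, log2

def mutual_information(n, p0=0.1, p1=0.6):
    """Exact I(X;Y) in bits for X ~ Bernoulli(1/2) driving n independent
    binary units. Each unit fires with probability p1 when X=1 and p0
    when X=0; Y is the population spike count, so Y | X is binomial."""
    def binom_pmf(p, k):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    mi = 0.0
    for k in range(n + 1):
        j0 = 0.5 * binom_pmf(p0, k)   # P(X=0, Y=k)
        j1 = 0.5 * binom_pmf(p1, k)   # P(X=1, Y=k)
        py = j0 + j1                  # marginal P(Y=k)
        for joint in (j0, j1):
            if joint > 0:
                mi += joint * log2(joint / (0.5 * py))
    return mi

for n in (1, 2, 5, 10, 20, 50, 100):
    print(n, round(mutual_information(n), 4))
```

With these assumed parameters the curve climbs quickly for small n and then creeps toward 1 bit, so the extra neurons mainly make the count distributions under X=0 and X=1 more separable (reliability) rather than adding capacity, which mirrors the abstract's qualitative claim.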