Paper Title


Evaluating Error Bound for Physics-Informed Neural Networks on Linear Dynamical Systems

Authors

Shuheng Liu, Xiyue Huang, Pavlos Protopapas

Abstract


There have been extensive studies on solving differential equations using physics-informed neural networks. While this method has proven advantageous in many cases, a major criticism lies in its lack of analytical error bounds. Therefore, it is less credible than its traditional counterparts, such as the finite difference method. This paper shows that one can mathematically derive explicit error bounds for physics-informed neural networks trained on a class of linear systems of differential equations. More importantly, evaluating such error bounds only requires evaluating the infinity norm of the differential equation residual over the domain of interest. Our work establishes a link between the network residual, which is known and used as the loss function, and the absolute error of the solution, which is generally unknown. Our approach is semi-phenomenological and independent of knowledge of the actual solution or of the complexity or architecture of the network. Using the method of manufactured solutions on linear ODEs and systems of linear ODEs, we empirically verify the error evaluation algorithm and demonstrate that the actual error strictly lies within our derived bounds.
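The quantity the abstract says must be evaluated, the residual infinity norm over the domain, can be approximated numerically by sampling the residual on a dense grid. The following is a minimal sketch, not the authors' implementation: `u_approx` is a hypothetical stand-in for a trained PINN output on the example linear ODE u'(t) + u(t) = 0, with the derivative taken by central finite differences.

```python
import numpy as np

# Hypothetical stand-in for a trained PINN solution of u'(t) + u(t) = 0
# on [0, 1]: the exact solution exp(-t) plus a small artificial defect.
def u_approx(t):
    return np.exp(-t) + 1e-3 * np.sin(5 * t)

# Residual r(t) = u'(t) + u(t); derivative via central finite difference.
def residual(t, h=1e-6):
    du = (u_approx(t + h) - u_approx(t - h)) / (2 * h)
    return du + u_approx(t)

ts = np.linspace(0.0, 1.0, 1001)        # dense sample of the domain
res_inf = np.max(np.abs(residual(ts)))  # estimate of ||r||_inf
print(res_inf)
```

In an actual PINN setting, `u_approx` would be the network forward pass and the derivative would typically come from automatic differentiation rather than finite differences; the grid-maximum is only a lower estimate of the true supremum, so a sufficiently dense sample is needed.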
