Laplace neural operator for solving differential equations
Nature Machine Intelligence (IF 18.8), Pub Date: 2024-06-24, DOI: 10.1038/s42256-024-00844-4
Qianying Cao, Somdatta Goswami, George Em Karniadakis

Neural operators map input functions to output functions, possibly in different spaces, unlike standard neural networks. Hence, neural operators allow the solution of parametric ordinary differential equations (ODEs) and partial differential equations (PDEs) for a distribution of boundary or initial conditions and excitations, and can also be used for system identification and for designing various components of digital twins. We introduce the Laplace neural operator (LNO), which incorporates the pole–residue relationship between the input and output spaces, leading to better interpretability and generalization for certain classes of problems. The LNO can process non-periodic signals and transient responses arising from both zero and non-zero initial conditions, enabling it to achieve better approximation accuracy than other neural operators in extrapolation settings when solving several ODEs and PDEs. We also highlight the LNO's good interpolation ability, from a low-resolution input to high-resolution outputs at arbitrary locations within the domain. To demonstrate the scalability of the LNO, we conduct large-scale simulations of Rossby waves around the globe, employing millions of degrees of freedom. Taken together, our findings show that a pretrained LNO model offers an effective real-time solution for general ODEs and PDEs at scale and is an efficient alternative to existing neural operators.
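To make the pole–residue idea in the abstract concrete, the sketch below shows one Laplace-domain layer in PyTorch: the operator is parameterized by trainable system poles and residues, the input function is decomposed into its frequency content, and the output is assembled from a steady-state term (driven by the input frequencies) and a transient term (driven by the system poles). This is a minimal illustration, not the authors' released implementation; the class name LaplaceLayer1d, the per-channel parameter shapes, the unconstrained random pole initialization, and the normalization choice are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class LaplaceLayer1d(nn.Module):
    """One Laplace-domain layer (illustrative sketch): a trainable pole-residue
    transfer function K(s) = sum_n beta_n / (s - mu_n) applied to the input's
    frequency content, yielding a steady-state plus a transient response."""

    def __init__(self, channels: int, n_system_poles: int, n_input_modes: int):
        super().__init__()
        scale = 1.0 / channels
        # Trainable (complex) system poles mu_n and residues beta_n, per channel.
        # In practice the poles would typically be constrained to have negative
        # real parts so that the transient term decays.
        self.poles = nn.Parameter(scale * torch.randn(channels, n_system_poles, dtype=torch.cfloat))
        self.residues = nn.Parameter(scale * torch.randn(channels, n_system_poles, dtype=torch.cfloat))
        self.n_input_modes = n_input_modes

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, n_t) input function sampled on the uniform time grid t: (n_t,)
        n_t = x.shape[-1]
        dt = (t[1] - t[0]).item()

        # Input decomposition: Fourier coefficients alpha_l at frequencies s = i*omega_l.
        alpha = torch.fft.rfft(x, dim=-1)[..., : self.n_input_modes]                 # (B, C, L)
        omega = 2j * torch.pi * torch.fft.rfftfreq(n_t, d=dt)[: self.n_input_modes]  # (L,)
        omega = omega.to(x.device)

        mu, beta = self.poles, self.residues                                          # (C, N)

        # Steady-state part: alpha_l * K(i*omega_l) * exp(i*omega_l * t).
        K = (beta.unsqueeze(-1) / (omega - mu.unsqueeze(-1))).sum(dim=1)              # (C, L)
        exp_om_t = torch.exp(omega[:, None] * t[None, :].to(omega.dtype))             # (L, T)
        steady = torch.einsum("bcl,lt->bct", alpha * K, exp_om_t)

        # Transient part: excitation of the system poles, gamma_n * exp(mu_n * t).
        gamma = beta * (alpha.unsqueeze(2) / (mu.unsqueeze(-1) - omega)).sum(dim=-1)  # (B, C, N)
        exp_mu_t = torch.exp(mu.unsqueeze(-1) * t[None, None, :].to(mu.dtype))        # (C, N, T)
        transient = torch.einsum("bcn,cnt->bct", gamma, exp_mu_t)

        # Normalization constants of the inverse transform are absorbed into the
        # trainable residues; return the physically meaningful real part.
        return (steady + transient).real / n_t


# Example usage on a toy batch of input functions (shapes are arbitrary choices):
layer = LaplaceLayer1d(channels=4, n_system_poles=8, n_input_modes=16)
t = torch.linspace(0.0, 1.0, 128)
x = torch.randn(2, 4, 128)   # batch of 2 input functions with 4 channels on the time grid
y = layer(x, t)              # (2, 4, 128) approximate operator output on the same grid
```

Because the layer evaluates exp(i*omega_l*t) and exp(mu_n*t) at arbitrary time points, the same trained parameters can, in principle, be queried on a finer grid than the training resolution, which is how the interpolation ability described in the abstract would be exercised in this sketch.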




Updated: 2024-06-25