Solving PDEs on unknown manifolds with machine learning
Applied and Computational Harmonic Analysis ( IF 2.6 ) Pub Date : 2024-02-29 , DOI: 10.1016/j.acha.2024.101652
Senwei Liang , Shixiao W. Jiang , John Harlim , Haizhao Yang

This paper proposes a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds, identified with point clouds, based on diffusion maps (DM) and deep learning. The PDE solver is formulated as a supervised learning task that solves a least-squares regression problem imposing an algebraic equation which approximates the PDE (and boundary conditions, if applicable). This algebraic equation involves a graph-Laplacian type matrix obtained via a DM asymptotic expansion, which is a consistent estimator of second-order elliptic differential operators. The resulting numerical method solves a highly non-convex empirical risk minimization problem over a hypothesis space of neural networks (NNs). In a well-posed elliptic PDE setting, when the hypothesis space consists of neural networks with either infinite width or depth, we show that the global minimizer of the empirical loss function is a consistent solution in the limit of large training data. When the hypothesis space is a two-layer neural network, we show that for a sufficiently large width, gradient descent can identify a global minimizer of the empirical loss function. Supporting numerical examples demonstrate the convergence of the solutions, ranging from simple manifolds with low and high co-dimensions to rough surfaces with and without boundaries. We also show that the proposed NN solver can robustly generalize the PDE solution to new data points, with generalization errors nearly identical to the training errors, superseding a Nyström-based interpolation method.
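The core pipeline described in the abstract can be sketched with a minimal NumPy example: sample a point cloud from a manifold (here, the unit circle in R², standing in for an "unknown" manifold), build the diffusion-maps graph Laplacian as an estimator of the Laplace–Beltrami operator, and impose the PDE as an algebraic least-squares system on the point cloud. For brevity, a direct linear solve stands in for the neural-network hypothesis space; the bandwidth `eps` and the test PDE `u - Δ_M u = f` are illustrative choices, not the paper's exact setup.

```python
import numpy as np

# Point cloud from an "unknown" manifold: N points on the unit circle
# embedded in R^2 (the solver only sees the ambient coordinates X).
N = 400
theta = 2 * np.pi * np.arange(N) / N
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Diffusion-maps (alpha = 1) graph Laplacian: a consistent estimator of
# the Laplace-Beltrami operator as N -> infinity and eps -> 0.
eps = 0.02  # kernel bandwidth (illustrative choice)
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise |x_i - x_j|^2
W = np.exp(-D2 / (4 * eps))
q = W.sum(axis=1)
W1 = W / np.outer(q, q)                  # alpha = 1 normalization removes density bias
d = W1.sum(axis=1)
L = (W1 / d[:, None] - np.eye(N)) / eps  # L u ~= Laplace-Beltrami of u

# Elliptic PDE on the manifold: u - Δ_M u = f.  With u(θ) = sin θ on the
# unit circle, Δ_M u = -sin θ, so the exact solution of f = 2 sin θ is sin θ.
f = 2 * np.sin(theta)
u = np.linalg.solve(np.eye(N) - L, f)    # algebraic equation imposed via DM

rel_err = np.linalg.norm(u - np.sin(theta)) / np.linalg.norm(np.sin(theta))
print(f"relative error: {rel_err:.3e}")
```

In the paper's actual method, the linear solve above is replaced by minimizing the empirical loss ‖(I − L)u_θ − f‖² over NN parameters θ with gradient descent, which also yields a solver that can be evaluated at new points.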
