Surrogate construction via weight parameterization of residual neural networks
Computer Methods in Applied Mechanics and Engineering (IF 6.9) | Pub Date: 2024-10-30 | DOI: 10.1016/j.cma.2024.117468 | Authors: Oscar H. Diaz-Ibarra, Khachik Sargsyan, Habib N. Najm
Surrogate model development is a critical step for uncertainty quantification and other sample-intensive tasks involving complex computational models. In this work we develop a multi-output surrogate form using a class of neural networks (NNs) that employ shortcut connections, namely residual NNs (ResNets). ResNets are known to regularize the surrogate learning problem and improve the efficiency and accuracy of the resulting surrogate. Inspired by the continuous neural ODE analogy, we augment ResNets with a weight-parameterization strategy with respect to ResNet depth. Weight-parameterized ResNets regularize the NN surrogate learning problem and allow better generalization with a drastically reduced number of learnable parameters. We demonstrate that weight-parameterized ResNets are more accurate and efficient than conventional feed-forward multi-layer perceptron networks. We also compare various options for parameterizing the weights as functions of ResNet depth. We demonstrate the results on both synthetic examples and a large-scale Earth system model of interest.
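The key idea described in the abstract is to make the ResNet weights a function of depth, so the number of learnable parameters does not grow with the number of residual blocks. The minimal PyTorch sketch below illustrates one such option, a low-order polynomial in the normalized depth t; the class name, polynomial basis, activation, step size, and dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class WeightParameterizedResNet(nn.Module):
    """Sketch of a ResNet surrogate whose hidden-layer weights W(t) and
    biases b(t) are low-order polynomials of normalized depth t in [0, 1],
    so the parameter count is independent of the number of residual blocks."""

    def __init__(self, dim_in, dim_hidden, dim_out, depth=20, order=3):
        super().__init__()
        self.depth = depth
        self.encoder = nn.Linear(dim_in, dim_hidden)
        self.decoder = nn.Linear(dim_hidden, dim_out)
        # One coefficient tensor per monomial t^j (assumed monomial basis).
        self.W_coeffs = nn.Parameter(0.01 * torch.randn(order + 1, dim_hidden, dim_hidden))
        self.b_coeffs = nn.Parameter(torch.zeros(order + 1, dim_hidden))

    def forward(self, x):
        h = 1.0 / self.depth                       # step size, following the ODE analogy
        z = torch.tanh(self.encoder(x))
        for k in range(self.depth):
            t = k / (self.depth - 1)               # normalized depth in [0, 1]
            basis = torch.tensor([t ** j for j in range(self.W_coeffs.shape[0])],
                                 dtype=self.W_coeffs.dtype)
            W = torch.einsum('j,jab->ab', basis, self.W_coeffs)   # W(t)
            b = torch.einsum('j,ja->a', basis, self.b_coeffs)     # b(t)
            z = z + h * torch.tanh(z @ W.T + b)    # residual (forward-Euler-like) update
        return self.decoder(z)

if __name__ == "__main__":
    # Illustrative multi-output surrogate: 5 inputs, 3 outputs.
    model = WeightParameterizedResNet(dim_in=5, dim_hidden=64, dim_out=3)
    y = model(torch.randn(128, 5))
    print(y.shape)  # torch.Size([128, 3])
```

With this construction, increasing the depth refines the residual recursion without adding learnable parameters, which is one way the drastic parameter reduction mentioned in the abstract can arise.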
Updated: 2024-10-30