Neural-network approaches to constructing machine learning potentials: DeepPot-SE, SchNet, DimeNet, sGDML, PaiNN, SpookyNet, GemNet, NewtonNet, UNiTE, NequIP, and so on
Published: 2023-03-22

GitHub - hsulab/GDPy: Generating Deep Potential with Python https://github.com/hsulab/GDPy

GDPy stands for Generating Deep Potential with Python (GDPy/GDP¥). It includes a set of tools and Python modules to automate structure exploration and training for machine learning interatomic potentials (MLIPs).

It focuses mainly on applications in heterogeneous catalysis. Target systems include metal oxides, supported clusters, and solid–liquid interfaces.


GDML [15] is a kernel-based machine learning method that uses Gaussian process regression to model the potential energy surface (PES) of a molecular system. In contrast to GNNs, GDML does not operate on a graph structure but on a set of input features that describe the molecular geometry.
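As a toy illustration of kernel-based PES learning, the sketch below fits a one-dimensional harmonic "potential" with kernel ridge regression (equivalent to the GP posterior mean). This is only a sketch under simplifying assumptions: real GDML learns in the force (gradient) domain with a Hessian kernel so that the predicted force field is energy-conserving, and all function names and parameters here are illustrative.

```python
import numpy as np

# Toy sketch of kernel-based PES regression in the spirit of GDML.
# NOTE: real GDML learns from forces with a Hessian kernel to guarantee
# energy conservation; here we simply fit energies of a 1-D toy
# potential with kernel ridge regression (the GP mean predictor).

def rbf_kernel(X, Y, sigma=0.5):
    """Gaussian (RBF) kernel matrix between 1-D point sets X and Y."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_krr(X_train, y_train, lam=1e-6, sigma=0.5):
    """Solve (K + lam*I) alpha = y for the regression coefficients."""
    K = rbf_kernel(X_train, X_train, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

def predict(X_test, X_train, alpha, sigma=0.5):
    return rbf_kernel(X_test, X_train, sigma) @ alpha

X = np.linspace(0.5, 3.0, 40)      # toy bond-length grid
E = (X - 1.5) ** 2                 # toy harmonic "PES", minimum at 1.5
alpha = fit_krr(X, E)
E_pred = predict(np.array([1.5, 2.0]), X, alpha)
print(E_pred)                      # close to [0.0, 0.25]
```

Once the coefficients are fitted, forces could be obtained by differentiating the kernel analytically, which is exactly the quantity GDML models directly.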


Models based on graph neural networks (GNNs): DeepPot-SE, SchNet, ANI, DimeNet, GemNet and NequIP [1–14].


DeepPot-SE: an end-to-end, symmetry-preserving deep potential model [1]. A learned embedding network encodes each atom's local environment into a descriptor that is invariant under translation, rotation, and permutation of identical neighbors; a fitting network then maps this descriptor to an atomic energy contribution, and the total energy is the sum over atoms.
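A minimal numerical sketch of such a symmetry-preserving descriptor is shown below. This is a hypothetical simplification: the learned embedding network is replaced by fixed random tanh features of the radial weight s(r), and names and sizes are illustrative, but the construction D = Gᵀ(RRᵀ)G is invariant to rotations and neighbor permutations just as in DeepPot-SE.

```python
import numpy as np

# Minimal sketch of a DeepPot-SE-style symmetry-preserving descriptor
# for a single atom. Hypothetical simplification: the learned embedding
# network is replaced by fixed random features of the radial weight s(r).

def radial_weight(r, rc=6.0):
    """Smooth 1/r weight that decays to zero at the cutoff rc."""
    return np.where(r < rc, (1.0 / r) * 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def descriptor(neighbors, rc=6.0, M=4, seed=0):
    """neighbors: (N, 3) coordinates of the neighbors relative to atom i."""
    r = np.linalg.norm(neighbors, axis=1)
    s = radial_weight(r, rc)
    # generalized environment matrix, rows (s, s*x/r, s*y/r, s*z/r)
    R = np.concatenate([s[:, None], s[:, None] * neighbors / r[:, None]], axis=1)
    # stand-in for the embedding network G(s): random tanh features
    W = np.random.default_rng(seed).standard_normal((1, M))
    G = np.tanh(s[:, None] @ W)                    # (N, M)
    # D = G^T (R R^T) G is invariant to rotations and neighbor permutations
    return (G.T @ R @ R.T @ G).ravel()

nb = np.array([[1.0, 0.2, -0.3], [0.5, 1.1, 0.4], [-0.8, 0.3, 0.9]])
D1 = descriptor(nb)
theta = 0.3                                        # rotate the environment
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
D2 = descriptor(nb @ Q.T)
print(np.allclose(D1, D2))  # True: the descriptor is rotation-invariant
```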


ANI (ANAKIN-ME): although often listed alongside GNN potentials, ANI does not use a message-passing graph architecture. It describes each atom's chemical environment with fixed, modified Behler–Parrinello symmetry functions (the "atomic environment vectors", comprising radial and angular terms) and feeds these descriptors to element-specific feed-forward neural networks; the total energy is the sum of the predicted atomic contributions [14].
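A sketch of one radial symmetry function of the kind ANI builds on is given below. The parameter values are illustrative, not ANI's actual AEV settings, and real AEVs also include angular, element-pair-resolved terms.

```python
import numpy as np

# Sketch of a Behler-Parrinello-style radial symmetry function, the kind
# of fixed descriptor ANI feeds to its element-specific networks.
# Parameter values are illustrative, not ANI's published AEV settings.

def cosine_cutoff(r, rc=5.2):
    """Smooth cutoff that vanishes at rc (same length units as r)."""
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def radial_sf(distances, eta=4.0, shifts=(1.0, 2.0, 3.0, 4.0), rc=5.2):
    """One Gaussian-weighted neighbor density per radial shift Rs."""
    r = np.asarray(distances, dtype=float)
    return np.array([np.sum(np.exp(-eta * (r - Rs) ** 2) * cosine_cutoff(r, rc))
                     for Rs in shifts])

G = radial_sf([0.96, 1.8, 2.5])   # neighbor distances of one atom
print(G.shape)  # (4,): one feature per radial shift
```

Because each feature sums over neighbors, the descriptor is automatically invariant to permuting identical atoms, which is what lets a plain feed-forward network consume it.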


SchNet uses a continuous-filter convolutional neural network, which can be seen as a variant of a graph convolutional network (GCN). Because the filters are continuous functions of the interatomic distances, SchNet can learn smooth, complex representations of the local atomic environments in a molecule. It also incorporates pairwise interaction terms that account for the interactions between all pairs of atoms in the molecule.
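One interaction step of a continuous-filter convolution can be sketched as follows. Feature sizes are hypothetical and the "filter network" is an untrained random matrix; the real SchNet stacks several learned interaction blocks with nonlinearities.

```python
import numpy as np

# Sketch of one SchNet-style continuous-filter convolution (cfconv) step.
# Hypothetical feature sizes and an untrained random "filter network".

def rbf_expand(d, centers=np.linspace(0.0, 5.0, 16), gamma=10.0):
    """Expand distances in a Gaussian basis (SchNet's edge features)."""
    return np.exp(-gamma * (d[..., None] - centers) ** 2)

def cfconv(x, pos, W_filter):
    """x: (N, F) atom features; pos: (N, 3) Cartesian coordinates.
    Each atom aggregates neighbor features, weighted element-wise by a
    filter that is a continuous function of the interatomic distance."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)  # (N, N)
    filters = rbf_expand(d) @ W_filter                              # (N, N, F)
    mask = ~np.eye(len(x), dtype=bool)                              # no self-interaction
    return (filters * x[None, :, :] * mask[:, :, None]).sum(axis=1)

rng = np.random.default_rng(0)
N, F = 5, 8
x = rng.standard_normal((N, F))      # initial atom-wise features
pos = rng.standard_normal((N, 3))    # atom positions
W = rng.standard_normal((16, F))     # stand-in for the learned filter net
y = cfconv(x, pos, W)
print(y.shape)  # (5, 8)
```

Since the filters depend only on interatomic distances, the output features are unchanged by rigid translations or rotations of the molecule.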


REANN (recursively embedded atom neural network) [16]


  1. L. Zhang, J. Han, H. Wang, W. A. Saidi, R. Car, W. E, End-to-end symmetry preserving inter-atomic potential energy model for finite and extended systems, NIPS'18: Proceedings of the 32nd International Conference on Neural Information Processing Systems, Curran Associates Inc., Red Hook, NY, USA, 2018, pp. 4441–4451.

  2. Schütt K, Unke O and Gastegger M 2021 Equivariant message passing for the prediction of tensorial properties and molecular spectra Int. Conf. on Machine Learning 

  3. Gasteiger J, Becker F and Günnemann S 2021 GemNet: universal directional graph neural networks for molecules Neural Information Processing Systems 

  4.  Batzner S, Musaelian A, Sun L, Geiger M, Mailoa J P, Kornbluth M, Molinari N, Smidt T E and Kozinsky B 2022 E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials Nat. Commun. 13 2453 

  5.  Unke O T, Chmiela S, Gastegger M, Schütt K T, Sauceda H E and Müller K-R 2021 SpookyNet: learning force fields with electronic degrees of freedom and nonlocal effects Nat. Commun. 12 7273 

  6.  Gasteiger J, Groß J and Günnemann S 2020 Directional message passing for molecular graphs Int. Conf. on Learning Representations (ICLR) 

  7. Gasteiger J, Giri S, Margraf J T and Günnemann S 2020 Fast and uncertainty-aware directional message passing for non-equilibrium molecules Machine Learning for Molecules Workshop (NeurIPS)

  8. Park C W, Kornbluth M, Vandermause J, Wolverton C, Kozinsky B and Mailoa J P 2020 Accurate and scalable multi-element graph neural network force field and molecular dynamics with direct force architecture (arXiv:2007.14444)

  9.  Unke O T and Meuwly M 2019 PhysNet: a neural network for predicting energies, forces, dipole moments and partial charges J. Chem. Theory Comput. 15 3678–93 

  10.  Schütt K T, Kindermans P-J, Sauceda H E, Chmiela S, Tkatchenko A and Müller K-R 2017 SchNet: a continuous-filter convolutional neural network for modeling quantum interactions Neural Information Processing Systems 

  11.  Schütt K T, Arbabzadah F, Chmiela S, Müller K R and Tkatchenko A 2017 Quantum-chemical insights from deep tensor neural networks Nat. Commun. 8 13890 

  12.  Schütt K T, Kessel P, Gastegger M, Nicoli K A, Tkatchenko A and Müller K-R 2019 SchNetPack: a deep learning toolbox for atomistic systems J. Chem. Theory Comput. 15 448–55 

  13.  Zubatyuk R, Smith J S, Leszczynski J and Isayev O 2019 Accurate and transferable multitask prediction of chemical properties with an atoms-in-molecules neural network Sci. Adv. 5 eaav6490 

  14. J.S. Smith, O. Isayev, A.E. Roitberg, ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost, Chem. Sci. 8 (2017) 3192–3203.

  15. Chmiela S, Tkatchenko A, Sauceda H E, Poltavsky I, Schütt K T and Müller K-R 2017 Machine learning of accurate energy-conserving molecular force fields Sci. Adv. 3 e1603015 (GDML)

    1. The original EANN model: Yaolong Zhang, Ce Hu and Bin Jiang J. Phys. Chem. Lett. 10, 4962-4967 (2019).
    2. The EANN model for dipole/transition dipole/polarizability: Yaolong Zhang, Sheng Ye, Jinxiao Zhang, Jun Jiang and Bin Jiang J. Phys. Chem. B 124, 7284–7290 (2020).
    3. The theory of REANN model: Yaolong Zhang, Junfan Xia and Bin Jiang Phys. Rev. Lett. 127, 156002 (2021).
    4. The details about the implementation of REANN: Yaolong Zhang, Junfan Xia and Bin Jiang J. Chem. Phys. 156, 114801 (2022).