Empowering deep neural quantum states through efficient optimization
Nature Physics (IF 17.6), Pub Date: 2024-07-01, DOI: 10.1038/s41567-024-02566-1
Ao Chen, Markus Heyl

Computing the ground state of interacting quantum matter is a long-standing challenge, especially for complex two-dimensional systems. Recent developments have highlighted the potential of neural quantum states to solve the quantum many-body problem by encoding the many-body wavefunction into artificial neural networks. However, this method has faced the critical limitation that existing optimization algorithms are not suitable for training modern large-scale deep network architectures. Here, we introduce a minimum-step stochastic-reconfiguration optimization algorithm, which allows us to train deep neural quantum states with up to 10^6 parameters. We demonstrate our method for paradigmatic frustrated spin-1/2 models on square and triangular lattices, for which our trained deep networks approach machine precision and yield improved variational energies compared to existing results. Equipped with our optimization algorithm, we find numerical evidence for gapless quantum-spin-liquid phases in the considered models, an open question to date. We present a method that captures the emergent complexity in quantum many-body problems through the expressive power of large-scale artificial neural networks.
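The core idea of a minimum-step stochastic-reconfiguration (SR) update can be illustrated with a short sketch: instead of solving the usual Np x Np SR linear system in parameter space, the update is obtained from an Ns x Ns system in sample space, which remains tractable even when the network has on the order of 10^6 parameters and only a few thousand Monte Carlo samples are used. The sketch below is an illustration under standard SR conventions, not the authors' implementation; the function name minsr_update, the Tikhonov shift lam, and the scalings of O and eps are assumptions made for the example.

```python
import numpy as np

def minsr_update(O, eps, lam=1e-4, lr=1e-2):
    """One stochastic-reconfiguration step solved in sample space.

    O   : (Ns, Np) matrix of centered log-derivatives,
          O[k, j] ~ (d ln psi(s_k) / d theta_j - sample mean) / sqrt(Ns)
    eps : (Ns,) vector of centered local energies, scaled by -1/sqrt(Ns)
    lam : Tikhonov regularization shift (assumption; other schemes are possible)
    lr  : learning rate

    Conventional SR solves the Np x Np system (O^H O) dtheta = O^H eps.
    Here we instead form the Ns x Ns kernel T = O O^H and take the
    minimum-norm solution of O dtheta = eps,
        dtheta = O^H (O O^H + lam*I)^{-1} eps,
    which is cheap whenever the number of samples Ns is much smaller
    than the number of parameters Np (e.g. Np ~ 1e6, Ns ~ 1e3-1e4).
    """
    Ns, Np = O.shape
    T = O @ O.conj().T                          # (Ns, Ns) sample-space kernel
    x = np.linalg.solve(T + lam * np.eye(Ns), eps)
    dtheta = O.conj().T @ x                     # minimum-norm parameter update
    return lr * dtheta


if __name__ == "__main__":
    # Toy dimensions: many parameters, few samples.
    rng = np.random.default_rng(0)
    Ns, Np = 256, 10_000
    O = (rng.normal(size=(Ns, Np)) + 1j * rng.normal(size=(Ns, Np))) / np.sqrt(Ns)
    eps = rng.normal(size=Ns) + 1j * rng.normal(size=Ns)
    print(minsr_update(O, eps).shape)  # (10000,)
```

In exact arithmetic (and without regularization) this sample-space route returns the same minimum-norm update as conventional SR with a pseudoinverse; the saving lies only in the size of the linear system that has to be solved.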



