Theoretical guarantees for permutation-equivariant quantum neural networks
npj Quantum Information (IF 6.6), Pub Date: 2024-01-22, DOI: 10.1038/s41534-024-00804-1
Louis Schatzki, Martín Larocca, Quynh T. Nguyen, Frédéric Sauvage, M. Cerezo

Despite the great promise of quantum machine learning models, there are several challenges one must overcome before unlocking their full potential. For instance, models based on quantum neural networks (QNNs) can suffer from excessive local minima and barren plateaus in their training landscapes. Recently, the nascent field of geometric quantum machine learning (GQML) has emerged as a potential solution to some of those issues. The key insight of GQML is that one should design architectures, such as equivariant QNNs, encoding the symmetries of the problem at hand. Here, we focus on problems with permutation symmetry (i.e., symmetry group Sn) and show how to build Sn-equivariant QNNs. We provide an analytical study of their performance, proving that they do not suffer from barren plateaus, quickly reach overparametrization, and generalize well from small amounts of data. To verify our results, we perform numerical simulations for a graph state classification task. Our work provides theoretical guarantees for equivariant QNNs, thus indicating the power and potential of GQML.
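As a rough illustration of the construction described in the abstract, the following is a minimal Python sketch (not the authors' code) of a single Sn-equivariant QNN layer, assuming the equivariant generators are permutation-symmetric Pauli sums such as the sum of X over all qubits and the sum of ZZ over all qubit pairs; it numerically checks that the resulting layer unitary commutes with a qubit permutation, which is the defining equivariance property.

```python
# Minimal sketch of an S_n-equivariant QNN layer on n qubits.
# Assumption: generators are permutation-invariant Pauli sums
# (e.g. sum_j X_j, sum_j Z_j, sum_{j<k} Z_j Z_k).
import itertools
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def sum_single(pauli, n):
    """Permutation-invariant sum of one Pauli acting on each qubit."""
    return sum(kron_all([pauli if j == k else I2 for j in range(n)]) for k in range(n))

def sum_pair_zz(n):
    """Permutation-invariant sum of Z_j Z_k over all pairs j < k."""
    total = np.zeros((2**n, 2**n), dtype=complex)
    for j, k in itertools.combinations(range(n), 2):
        total += kron_all([Z if i in (j, k) else I2 for i in range(n)])
    return total

def sn_equivariant_layer(thetas, n):
    """One QNN layer: product of exponentials of S_n-invariant generators."""
    gens = [sum_single(X, n), sum_single(Z, n), sum_pair_zz(n)]
    U = np.eye(2**n, dtype=complex)
    for theta, G in zip(thetas, gens):
        U = expm(-1j * theta * G) @ U
    return U

def qubit_swap(n, a, b):
    """Unitary (permutation matrix) swapping qubits a and b on n qubits."""
    dim = 2**n
    P = np.zeros((dim, dim))
    for idx in range(dim):
        bits = list(format(idx, f"0{n}b"))
        bits[a], bits[b] = bits[b], bits[a]
        P[int("".join(bits), 2), idx] = 1.0
    return P

if __name__ == "__main__":
    n = 3
    U = sn_equivariant_layer([0.3, 0.7, 0.5], n)
    P = qubit_swap(n, 0, 2)
    # Equivariance check: permuting qubits commutes with the layer unitary.
    print(np.allclose(P @ U, U @ P))  # True
```

Because each generator is invariant under relabeling the qubits, every exponential in the layer commutes with every qubit permutation, which is what the final check confirms.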

Updated: 2024-01-23