Permutation-equivariant quantum convolutional neural networks
Quantum Science and Technology (IF 5.6) Pub Date: 2024-11-15, DOI: 10.1088/2058-9565/ad8e80
Sreetama Das and Filippo Caruso

The symmetric group Sn manifests itself in large classes of quantum systems as the invariance of certain characteristics of a quantum state under permutations of the qubits. Subgroups of Sn arise, among many other contexts, in describing the label symmetry of classical images with respect to spatial transformations such as reflection or rotation. Equipped with the formalism of geometric quantum machine learning, in this study we propose architectures of equivariant quantum convolutional neural networks (EQCNNs) that respect Sn and its subgroups. We demonstrate that a careful choice of the pixel-to-qubit embedding order facilitates the construction of EQCNNs for small subgroups of Sn. Our novel EQCNN architecture corresponding to the full permutation group Sn is built by applying all possible QCNNs with equal probability, which can also be conceptualized as a dropout strategy in quantum neural networks. For subgroups of Sn, our numerical results on MNIST datasets show better classification accuracy than non-equivariant QCNNs. The Sn-equivariant QCNN architecture shows significantly improved training and test performance compared to the non-equivariant QCNN for the classification of connected and non-connected graphs. When trained with a sufficiently large amount of data, the Sn-equivariant QCNN shows better average performance than the Sn-equivariant QNN. These results contribute towards building powerful quantum machine learning architectures for permutation-symmetric systems.
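As a rough illustration of the permutation-averaging idea described in the abstract (applying the same QCNN under every qubit ordering with equal probability), the following minimal PennyLane sketch builds a small 4-qubit toy circuit and averages its prediction over all qubit permutations. The circuit depth, gate choices, parameter shapes, and 4-qubit size are illustrative assumptions, not the authors' actual architecture from the paper.

```python
# Minimal sketch of an Sn-equivariant prediction: a fixed QCNN ansatz is
# applied under every qubit ordering and the outputs are averaged.
# All circuit details below are illustrative assumptions.
import itertools
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def conv_block(block_params, wires):
    """Two-qubit 'convolutional' unitary shared across neighbouring pairs."""
    qml.RY(block_params[0], wires=wires[0])
    qml.RY(block_params[1], wires=wires[1])
    qml.CNOT(wires=wires)

@qml.qnode(dev)
def qcnn(features, params, perm):
    # Encode the input features on the qubits.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # Convolution layer applied along the permuted qubit ordering.
    for i in range(n_qubits - 1):
        conv_block(params[i], wires=[perm[i], perm[i + 1]])
    # Simple readout: measure the first qubit of the permuted ordering.
    return qml.expval(qml.PauliZ(perm[0]))

def equivariant_prediction(features, params):
    """Average the same QCNN over all qubit orderings (feasible for small n)."""
    perms = list(itertools.permutations(range(n_qubits)))
    return np.mean([qcnn(features, params, list(p)) for p in perms])

params = np.random.uniform(0, np.pi, size=(n_qubits - 1, 2), requires_grad=True)
x = np.random.uniform(0, np.pi, size=n_qubits)
print(equivariant_prediction(x, params))
```

For larger qubit counts, the full average over all n! orderings becomes intractable; sampling a random permutation per forward pass, as in a dropout-style scheme, gives an unbiased estimate of the same quantity.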

Updated: 2024-11-15