Paper Title


Gradient Estimation with Constant Scaling for Hybrid Quantum Machine Learning

Authors

Thomas Hoffmann, Douglas Brown

Abstract


We present a novel method for determining gradients of parameterised quantum circuits (PQCs) in hybrid quantum-classical machine learning models by applying the multivariate version of the simultaneous perturbation stochastic approximation (SPSA) algorithm. The gradients of PQC layers can be calculated with an overhead of two evaluations per circuit per forward pass, independent of the number of circuit parameters, compared to the linear scaling of the parameter shift rule. These are then used in the backpropagation algorithm by applying the chain rule. We compare our method to the parameter shift rule for different circuit widths and batch sizes, and for a range of learning rates. We find that, as the number of qubits increases, our method converges significantly faster than the parameter shift rule and to a comparable accuracy, even when considering the optimal learning rate for each method.
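The constant-scaling property described in the abstract comes from the SPSA estimator itself: all parameters are perturbed at once along a single random direction, so only two circuit evaluations are needed regardless of the parameter count, versus two evaluations *per parameter* for the parameter shift rule. The following is a minimal classical sketch of that estimator, not the authors' implementation; `spsa_gradient` is an illustrative name, and a toy quadratic stands in for the PQC expectation value.

```python
import numpy as np

def spsa_gradient(f, theta, eps=0.01, rng=None):
    """Estimate the full gradient of f at theta from just two
    evaluations of f, independent of the dimension of theta (SPSA).

    The estimate is unbiased in expectation but noisy for a single
    sample; in training, this noise is averaged out over iterations.
    """
    rng = np.random.default_rng(rng)
    # Simultaneous random perturbation: +/-1 (Rademacher) entries.
    delta = rng.choice([-1.0, 1.0], size=np.shape(theta))
    # Two evaluations, whatever the number of parameters.
    f_plus = f(theta + eps * delta)
    f_minus = f(theta - eps * delta)
    # One scalar finite difference, divided element-wise by the
    # perturbation, yields an estimate for every component at once.
    return (f_plus - f_minus) / (2.0 * eps * delta)
```

In a hybrid model, this per-layer gradient estimate would then be combined with the classical layers' gradients via the chain rule inside ordinary backpropagation, as the abstract outlines.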
