Paper Title

Quantum Reinforcement Learning for Solving a Stochastic Frozen Lake Environment and the Impact of Quantum Architecture Choices

Authors

Drăgan, Theodora-Augustina, Monnet, Maureen, Mendl, Christian B., Lorenz, Jeanette Miriam

Abstract

Quantum reinforcement learning (QRL) models augment classical reinforcement learning schemes with quantum-enhanced kernels. Different proposals on how to construct such models empirically show a promising performance. In particular, these models might offer a reduced parameter count and shorter times to reach a solution than classical models. It is however presently unclear how these quantum-enhanced kernels as subroutines within a reinforcement learning pipeline need to be constructed to indeed result in an improved performance in comparison to classical models. In this work we exactly address this question. First, we propose a hybrid quantum-classical reinforcement learning model that solves a slippery stochastic frozen lake, an environment considerably more difficult than the deterministic frozen lake. Secondly, different quantum architectures are studied as options for this hybrid quantum-classical reinforcement learning model, all of them well-motivated by the literature. They all show very promising performances with respect to similar classical variants. We further characterize these choices by metrics that are relevant to benchmark the power of quantum circuits, such as the entanglement capability, the expressibility, and the information density of the circuits. However, we find that these typical metrics do not directly predict the performance of a QRL model.
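The abstract describes a hybrid quantum-classical reinforcement learning model applied to the slippery (stochastic) Frozen Lake environment. The paper's exact architecture is not given here, so the following is only a minimal sketch of that kind of pipeline, assuming Gymnasium's `FrozenLake-v1` environment and PennyLane: a small parameterized quantum circuit (angle embedding followed by a generic entangling ansatz) is wrapped between two classical linear layers as a Q-value approximator. The ansatz, encoder, and layer sizes are illustrative assumptions, not the configurations studied in the paper.

```python
import gymnasium as gym
import torch
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def vqc(inputs, weights):
    # Angle-encode the classically pre-processed state, apply a generic
    # entangling ansatz, and read out one expectation value per qubit.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridQNetwork(torch.nn.Module):
    # NOTE: illustrative assumption, not the architecture from the paper.
    def __init__(self, n_layers=2):
        super().__init__()
        shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
        self.encoder = torch.nn.Linear(16, n_qubits)   # one-hot state -> rotation angles
        self.q_layer = qml.qnn.TorchLayer(vqc, {"weights": shape})
        self.scaler = torch.nn.Linear(n_qubits, 4)     # expectation values -> Q-values

    def forward(self, state_onehot):
        return self.scaler(self.q_layer(self.encoder(state_onehot)))

# The slippery variant makes transitions stochastic: the chosen action is
# executed with probability 1/3, otherwise a perpendicular move occurs.
env = gym.make("FrozenLake-v1", map_name="4x4", is_slippery=True)
qnet = HybridQNetwork()
obs, _ = env.reset(seed=0)
state = torch.nn.functional.one_hot(torch.tensor(obs), 16).float()
print(qnet(state))  # four Q-value estimates for LEFT, DOWN, RIGHT, UP
```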
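The abstract also refers to circuit metrics such as entanglement capability and expressibility. As a sketch of one such metric, the snippet below estimates the entangling capability of an ansatz as the Meyer-Wallach entanglement measure averaged over randomly sampled parameters, a common definition of this quantity. The choice of ansatz, qubit count, and number of samples are assumptions for illustration only.

```python
import numpy as np
import pennylane as qml

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights):
    # Example ansatz whose entangling capability we estimate.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.state()

def meyer_wallach(state, n):
    # Q = 2 * (1 - mean purity of the single-qubit reduced states).
    psi = np.asarray(state).reshape([2] * n)
    purities = []
    for k in range(n):
        m = np.moveaxis(psi, k, 0).reshape(2, -1)
        rho = m @ m.conj().T
        purities.append(np.real(np.trace(rho @ rho)))
    return 2 * (1 - np.mean(purities))

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
samples = [
    meyer_wallach(circuit(np.random.uniform(0, 2 * np.pi, shape)), n_qubits)
    for _ in range(200)
]
print("estimated entangling capability:", np.mean(samples))
```

As the abstract notes, such metrics characterize the circuits but were not found to directly predict the performance of the QRL model.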
