Paper Title
Deep Machine Learning Reconstructing Lattice Topology with Strong Thermal Fluctuations
Paper Authors
Paper Abstract
Applying artificial intelligence to scientific problems (namely, AI for science) is currently under active debate. However, scientific problems differ greatly from conventional ones involving images, texts, etc., and new challenges emerge from unbalanced scientific data and the complicated effects of physical setups. In this work, we demonstrate the validity of a deep convolutional neural network (CNN) for reconstructing lattice topology (i.e., spin connectivities) in the presence of strong thermal fluctuations and unbalanced data. Taking the kinetic Ising model with Glauber dynamics as an example, the CNN maps the time-dependent local magnetic momenta (a single-node feature) evolved from a specific initial configuration (dubbed an evolution instance) to the probabilities of the presence of possible couplings. Our scheme is distinguished from previous ones that might require knowledge of the node dynamics, responses to perturbations, or the evaluation of statistical quantities such as correlations or transfer entropy over many evolution instances. Fine-tuning avoids the "barren plateau" caused by strong thermal fluctuations at high temperatures. Accurate reconstructions can be made even where thermal fluctuations dominate over correlations, so that statistical methods generally fail. Meanwhile, we unveil the generalization of the CNN to instances evolved from unlearnt initial spin configurations and to those with unlearnt lattices. We raise an open question on learning with unbalanced data in a nearly "double-exponentially" large sample space.
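The abstract mentions evolving the kinetic Ising model under Glauber dynamics to obtain time-dependent single-node features. As a minimal sketch of that data-generation step (not the authors' implementation; the coupling matrix `J`, temperature `T`, and the use of raw spin trajectories as stand-ins for local moments are illustrative assumptions), each Glauber update picks a random spin and flips it with probability 1/(1 + exp(ΔE/T)):

```python
import numpy as np

def glauber_step(spins, J, T, rng):
    """One Glauber update: pick a random site and flip it with
    probability 1 / (1 + exp(dE / T))."""
    i = rng.integers(len(spins))
    # dE is the energy cost of flipping spin i given its coupled neighbours.
    dE = 2.0 * spins[i] * (J[i] @ spins)
    if rng.random() < 1.0 / (1.0 + np.exp(dE / T)):
        spins[i] *= -1
    return spins

def evolve_instance(J, T, n_steps, seed=0):
    """Evolve one instance from a random initial configuration and
    return the trajectory of local spins (single-node features)."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    spins = rng.choice([-1.0, 1.0], size=n)
    traj = np.empty((n_steps, n))
    for t in range(n_steps):
        spins = glauber_step(spins, J, T, rng)
        traj[t] = spins
    return traj

# Example: a ring of 8 spins with nearest-neighbour couplings;
# the reconstruction task would be to recover the nonzero entries of J.
n = 8
J = np.zeros((n, n))
for i in range(n):
    J[i, (i + 1) % n] = J[i, (i - 1) % n] = 1.0
traj = evolve_instance(J, T=2.0, n_steps=100)
print(traj.shape)  # (100, 8)
```

Trajectories like `traj`, evolved from many initial configurations, would form the CNN's input, with the presence or absence of each coupling in `J` as the target labels.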