Paper Title
Adversarial methods to reduce simulation bias in neutrino interaction event filtering at Liquid Argon Time Projection Chambers
Paper Authors
Paper Abstract
For current and future neutrino oscillation experiments using large Liquid Argon Time Projection Chambers (LAr-TPCs), a key challenge is identifying neutrino interactions against the pervasive cosmic-ray background. Rejection of this background is often possible using traditional cut-based selections, but this typically requires the prior use of computationally expensive reconstruction algorithms. This work demonstrates an alternative approach using a 3D Submanifold Sparse Convolutional Network trained on low-level information from the scintillation light signal of interactions inside LAr-TPCs. The technique is applied to example simulations from ICARUS, the far detector of the Short Baseline Neutrino (SBN) program at Fermilab. The results of the network show that the cosmic background is reduced by up to 76.3% whilst the neutrino interaction selection efficiency remains above 98.9%. We further present a way to mitigate potential biases from imperfect input simulations by applying Domain Adversarial Neural Networks (DANNs), for which modified simulated samples are introduced to imitate real data and a small portion of them is used for adversarial training. A series of mock-data studies is performed and demonstrates the effectiveness of using DANNs to mitigate such biases, with neutrino interaction selection efficiencies significantly better than those achieved without adversarial training.
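The DANN approach described in the abstract typically couples a shared feature extractor to two heads, a classification head (neutrino vs. cosmic) and a domain head (nominal simulation vs. data-like sample), with the domain head trained through a gradient-reversal layer so the learned features become domain-invariant. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the dense feature extractor stands in for the paper's 3D Submanifold Sparse Convolutional Network, and all tensor shapes, names, and hyperparameters are illustrative assumptions.

    import torch
    import torch.nn as nn

    class GradientReversal(torch.autograd.Function):
        # Identity on the forward pass; negates (and scales) gradients on the backward pass.
        @staticmethod
        def forward(ctx, x, lambda_):
            ctx.lambda_ = lambda_
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lambda_ * grad_output, None

    class DANNClassifier(nn.Module):
        # Shared features, a label head (neutrino vs. cosmic), and a domain head
        # (nominal simulation vs. mock data) fed through gradient reversal.
        def __init__(self, n_features=64, lambda_=1.0):
            super().__init__()
            self.lambda_ = lambda_
            # Placeholder dense extractor; the paper instead uses a 3D submanifold
            # sparse CNN over the low-level scintillation-light signal.
            self.features = nn.Sequential(nn.Linear(n_features, 128), nn.ReLU(),
                                          nn.Linear(128, 64), nn.ReLU())
            self.label_head = nn.Linear(64, 2)   # neutrino interaction vs. cosmic background
            self.domain_head = nn.Linear(64, 2)  # nominal MC vs. data-like ("mock data") sample

        def forward(self, x):
            f = self.features(x)
            label_logits = self.label_head(f)
            domain_logits = self.domain_head(GradientReversal.apply(f, self.lambda_))
            return label_logits, domain_logits

    # One illustrative training step with random stand-in tensors.
    # In a full DANN setup the label loss would use only the labeled (nominal-simulation)
    # events, while the domain loss uses events from both domains.
    model = DANNClassifier()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(32, 64)               # stand-in for per-event light-signal features
    y_label = torch.randint(0, 2, (32,))  # 1 = neutrino, 0 = cosmic
    y_domain = torch.randint(0, 2, (32,)) # 0 = nominal MC, 1 = modified mock data
    label_logits, domain_logits = model(x)
    loss = (nn.functional.cross_entropy(label_logits, y_label)
            + nn.functional.cross_entropy(domain_logits, y_domain))
    opt.zero_grad()
    loss.backward()
    opt.step()

In this sketch the gradient-reversal layer makes the feature extractor maximize the domain loss while the domain head minimizes it, which is the standard DANN mechanism for suppressing features that differ between the nominal simulation and the data-like sample.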