Paper Title

Application of Transfer Learning to Neutrino Interaction Classification

Authors

Chappell, Andrew, Whitehead, Leigh H.

Abstract

Training deep neural networks using simulations typically requires very large numbers of simulated events. This can be a large computational burden and a limitation on the performance of the deep learning algorithm when insufficient numbers of events can be produced. We investigate the use of transfer learning, in which a set of simulated images is used to fine-tune a model trained on generic image recognition tasks, for the specific use case of neutrino interaction classification in a liquid argon time projection chamber. A ResNet18, pre-trained on photographic images, was fine-tuned using simulated neutrino images and, when trained with one hundred thousand training events, reached an F1 score of $0.896 \pm 0.002$ compared to $0.836 \pm 0.004$ from a randomly-initialised network trained with the same training sample. The transfer-learned networks also demonstrate lower bias as a function of energy and more balanced performance across different interaction types.
