Paper Title

Learning Independent Instance Maps for Crowd Localization

Paper Authors

Junyu Gao, Tao Han, Qi Wang, Yuan Yuan, Xuelong Li

Paper Abstract

Accurately locating each head's position in crowd scenes is a crucial task in the field of crowd analysis. However, traditional density-based methods only produce coarse predictions, while segmentation- and detection-based methods cannot handle extremely dense scenes or crowds with large-range scale variations. To this end, we propose an end-to-end, straightforward framework for crowd localization, named Independent Instance Map segmentation (IIM). Unlike density maps and box regression, the instances in IIM do not overlap. By segmenting crowds into independent connected components, the positions and the crowd counts (the centers and the number of components, respectively) are obtained. Furthermore, to improve segmentation quality across regions of different density, we present a differentiable Binarization Module (BM) that outputs structured instance maps. BM brings two advantages to localization models: 1) it adaptively learns a threshold map for each image, detecting each instance more accurately; 2) it allows the model to be trained directly with a loss on the binary predictions and labels. Extensive experiments verify that the proposed method is effective and outperforms state-of-the-art methods on five popular crowd datasets. Notably, IIM improves the F1-measure by 10.4% on the NWPU-Crowd localization task. The source code and pre-trained models will be released at https://github.com/taohan10200/IIM.
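To make the abstract's two core ideas concrete, below is a minimal sketch, not the authors' implementation (see the repository above for that). It assumes a network produces a per-pixel confidence map and a learned threshold map; the steep-sigmoid relaxation is the standard differentiable-binarization trick, and all names here (`score_map`, `threshold_map`, `k`) are illustrative assumptions. Connected-component labeling then turns a hard instance map into head positions and a count.

```python
import numpy as np
import torch
from scipy import ndimage

def differentiable_binarize(score_map: torch.Tensor,
                            threshold_map: torch.Tensor,
                            k: float = 50.0) -> torch.Tensor:
    # Steep sigmoid around a pixel-wise threshold: output is near-binary
    # but differentiable, so a loss on binary predictions vs. labels can
    # back-propagate to both the score branch and the threshold branch.
    return torch.sigmoid(k * (score_map - threshold_map))

def localize_from_instance_map(binary_map: np.ndarray):
    # Each connected component is treated as one head: its center of mass
    # is the predicted position, and the component count is the crowd count.
    labeled, num_instances = ndimage.label(binary_map)
    centers = ndimage.center_of_mass(
        binary_map, labeled, index=list(range(1, num_instances + 1)))
    return centers, num_instances

# Hypothetical inference flow: harden the soft map, then read off results.
# score, thresh = model(image)                       # assumed two-branch net
# soft = differentiable_binarize(score, thresh)
# hard = (soft > 0.5).squeeze().cpu().numpy().astype(np.uint8)
# centers, count = localize_from_instance_map(hard)
```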
