Paper Title

PillarNet: Real-Time and High-Performance Pillar-based 3D Object Detection

Paper Authors

Guangsheng Shi, Ruifeng Li, Chao Ma

Abstract

Real-time and high-performance 3D object detection is of critical importance for autonomous driving. Recent top-performing 3D object detectors mainly rely on point-based or 3D voxel-based convolutions, which are both computationally inefficient for onboard deployment. In contrast, pillar-based methods use solely 2D convolutions, which consume fewer computational resources, but they lag far behind their voxel-based counterparts in detection accuracy. In this paper, by examining the primary performance gap between pillar- and voxel-based detectors, we develop a real-time and high-performance pillar-based detector, dubbed PillarNet. The proposed PillarNet consists of a powerful encoder network for effective pillar feature learning, a neck network for spatial-semantic feature fusion, and a commonly used detection head. Using only 2D convolutions, PillarNet is flexible with respect to pillar size and compatible with classical 2D CNN backbones, such as VGGNet and ResNet. Additionally, PillarNet benefits from our designed orientation-decoupled IoU regression loss along with an IoU-aware prediction branch. Extensive experimental results on the large-scale nuScenes Dataset and Waymo Open Dataset demonstrate that the proposed PillarNet performs well over state-of-the-art 3D detectors in terms of effectiveness and efficiency. Code is available at \url{https://github.com/agent-sgs/PillarNet}.
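To make the pillar-based idea in the abstract concrete: a pillar encoder collapses the vertical axis of a point cloud by binning points into a 2D bird's-eye-view (BEV) grid and aggregating a feature per cell, producing a pseudo-image that ordinary 2D CNN backbones (e.g. VGGNet/ResNet-style encoders) can process. The sketch below is an illustrative toy version of that generic pipeline using max-pooled height as the pillar feature; it is not PillarNet's actual learned encoder, and all names and parameters are hypothetical.

```python
import numpy as np

def pillarize(points, grid_size=(8, 8), x_range=(0.0, 8.0), y_range=(0.0, 8.0)):
    """Scatter a point cloud of shape (N, 3) into a BEV pillar grid.

    Each point falls into one pillar (2D grid cell); here we max-pool the
    z-coordinate per pillar as a stand-in for a learned pillar feature.
    Returns a (H, W) pseudo-image suitable for a 2D convolutional backbone.
    Illustrative sketch only -- not the PillarNet encoder.
    """
    H, W = grid_size
    bev = np.full((H, W), -np.inf, dtype=np.float32)
    # Map each point's (x, y) coordinates to integer pillar indices.
    xs = ((points[:, 0] - x_range[0]) / (x_range[1] - x_range[0]) * W).astype(int)
    ys = ((points[:, 1] - y_range[0]) / (y_range[1] - y_range[0]) * H).astype(int)
    valid = (xs >= 0) & (xs < W) & (ys >= 0) & (ys < H)
    # Unbuffered scatter-max: keep the highest z among points in each pillar.
    np.maximum.at(bev, (ys[valid], xs[valid]), points[valid, 2])
    bev[np.isneginf(bev)] = 0.0  # empty pillars get a zero feature
    return bev

# Two points share pillar (0, 0); one lands in pillar (7, 7).
points = np.array([[0.5, 0.5, 1.2],
                   [0.6, 0.4, 2.0],
                   [7.5, 7.5, 0.3]], dtype=np.float32)
bev = pillarize(points)
print(bev[0, 0], bev[7, 7])  # max z per occupied pillar
```

Because the vertical dimension is eliminated up front, everything downstream runs as 2D convolutions over this pseudo-image, which is the efficiency advantage of pillar-based detectors that the abstract contrasts with 3D voxel-based convolutions.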
