Paper Title

AS-PD: An Arbitrary-Size Downsampling Framework for Point Clouds

Paper Authors

Peng Zhang, Ruoyin Xie, Jinsheng Sun, Weiqing Li, Zhiyong Su

Paper Abstract

Point cloud downsampling is a crucial pre-processing operation that reduces the number of points in order to unify data size and lower computational cost, among other benefits. Recent research on point cloud downsampling, which concentrates on learning to sample in a task-aware way, has achieved great success. However, existing learnable samplers cannot directly perform arbitrary-size downsampling and assume the input size is fixed. In this paper, we introduce AS-PD, a novel task-aware sampling framework that directly downsamples point clouds to any smaller size based on a sample-to-refine strategy. Given an input point cloud of arbitrary size, we first perform task-agnostic pre-sampling of the input point cloud to a specified sample size. Then, we obtain the sampled set by refining the pre-sampled set to make it task-aware, driven by downstream task losses. The refinement is realized by adding to each pre-sampled point a small offset predicted by point-wise multi-layer perceptrons (MLPs). With density encoding and a proper training scheme, the framework learns to adaptively downsample point clouds of different input sizes to arbitrary sample sizes. We evaluate the sampled results on classification and registration tasks, respectively. The proposed AS-PD surpasses the state-of-the-art method in terms of downstream performance. Further experiments also show that AS-PD exhibits better generality to unseen task models, implying that the proposed sampler is optimized for the task rather than for a specified task model.
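The sample-to-refine strategy described in the abstract can be pictured with a minimal PyTorch sketch: a task-agnostic pre-sampling step (farthest point sampling is used here as one common choice) followed by a point-wise MLP that adds a small predicted offset to each pre-sampled point. The function and class names, layer sizes, and the naive FPS loop below are illustrative assumptions, not the paper's actual architecture; the density encoding and the training scheme against downstream task losses are omitted.

```python
# Sketch of the sample-to-refine idea, assuming FPS as the task-agnostic
# pre-sampler and a simple point-wise MLP as the offset predictor.
import torch
import torch.nn as nn


def farthest_point_sampling(points: torch.Tensor, m: int) -> torch.Tensor:
    """Pre-sample m indices from an (N, 3) point cloud via farthest point sampling."""
    n = points.shape[0]
    selected = torch.zeros(m, dtype=torch.long)
    dist = torch.full((n,), float("inf"))
    selected[0] = torch.randint(n, (1,))
    for i in range(1, m):
        # Update each point's distance to its nearest already-selected point,
        # then pick the point that is farthest from the current selection.
        d = torch.sum((points - points[selected[i - 1]]) ** 2, dim=-1)
        dist = torch.minimum(dist, d)
        selected[i] = torch.argmax(dist)
    return selected


class OffsetRefiner(nn.Module):
    """Point-wise MLP predicting a per-point offset (hypothetical layer sizes)."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, pre_sampled: torch.Tensor) -> torch.Tensor:
        # Refined point = pre-sampled point + predicted offset; in AS-PD the
        # offsets are trained end-to-end against the downstream task loss.
        return pre_sampled + self.mlp(pre_sampled)


if __name__ == "__main__":
    cloud = torch.rand(1024, 3)               # arbitrary-size input cloud
    idx = farthest_point_sampling(cloud, 64)  # task-agnostic pre-sampling
    refined = OffsetRefiner()(cloud[idx])     # task-aware refined sample set
    print(refined.shape)                      # torch.Size([64, 3])
```

Because the refinement only shifts points that already exist at the target size, the same module can serve any requested sample size, which is what allows arbitrary-size downsampling in this scheme.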
