Paper Title

RipsNet: a general architecture for fast and robust estimation of the persistent homology of point clouds

Authors

Thibault de Surrel, Felix Hensel, Mathieu Carrière, Théo Lacombe, Yuichi Ike, Hiroaki Kurihara, Marc Glisse, Frédéric Chazal

Abstract

The use of topological descriptors in modern machine learning applications, such as Persistence Diagrams (PDs) arising from Topological Data Analysis (TDA), has shown great potential in various domains. However, their practical use in applications is often hindered by two major limitations: the computational complexity required to compute such descriptors exactly, and their sensitivity to even low-level proportions of outliers. In this work, we propose to bypass these two burdens in a data-driven setting by entrusting the estimation of (vectorizations of) PDs built on top of point clouds to a neural network architecture that we call RipsNet. Once trained on a given data set, RipsNet can estimate topological descriptors on test data very efficiently with generalization capacity. Furthermore, we prove that RipsNet is robust to input perturbations in terms of the 1-Wasserstein distance, a major improvement over the standard computation of PDs that only enjoys Hausdorff stability, enabling RipsNet to substantially outperform exactly-computed PDs in noisy settings. We showcase the use of RipsNet on both synthetic and real-world data. Our open-source implementation is publicly available at https://github.com/hensel-f/ripsnet and will be included in the Gudhi library.
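
To make the setup concrete, below is a minimal sketch of the exact pipeline that such a network is trained to approximate: build a Vietoris-Rips filtration on a point cloud with the Gudhi library, extract its degree-1 persistence diagram, and vectorize it (here as a persistence image) to obtain a regression target. The toy noisy-circle data and all parameter values (max_edge_length, bandwidth, resolution) are illustrative assumptions, not settings from the paper.

```python
import numpy as np
import gudhi
from gudhi.representations import PersistenceImage

# Toy data: a noisy circle (illustrative only, not a dataset from the paper).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
cloud = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(300, 2))

# Exact pipeline that a RipsNet-style estimator is trained to approximate:
# Vietoris-Rips filtration -> degree-1 persistence diagram -> vectorization.
rips = gudhi.RipsComplex(points=cloud, max_edge_length=2.0)
simplex_tree = rips.create_simplex_tree(max_dimension=2)
simplex_tree.compute_persistence()
diagram = simplex_tree.persistence_intervals_in_dimension(1)  # (birth, death) pairs in H1
diagram = diagram[np.isfinite(diagram[:, 1])]                 # keep finite bars only

# Vectorize the diagram as a persistence image; the flat vector is the regression target.
vectorizer = PersistenceImage(bandwidth=0.05, resolution=[20, 20])
target = vectorizer.fit_transform([diagram])[0]
print(target.shape)  # (400,)
```

Repeating this over a collection of point clouds produces (point cloud, vectorized PD) training pairs; at test time, the trained network replaces this comparatively costly computation.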
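
The abstract does not spell out the network internals; since the input is an unordered point set, a natural choice is a permutation-invariant (DeepSets-style) set-to-vector regressor. The sketch below is a purely hypothetical illustration of that idea, not the reference RipsNet implementation from the linked repository; layer sizes, the mean pooling, and the output dimension (matching the 20x20 persistence image above) are assumptions.

```python
import tensorflow as tf

def make_set_regressor(point_dim=2, out_dim=400):
    """Permutation-invariant point-cloud -> vector regressor (illustrative sketch only)."""
    inputs = tf.keras.Input(shape=(None, point_dim))           # a point cloud of arbitrary size
    x = tf.keras.layers.Dense(64, activation="relu")(inputs)   # per-point feature map
    x = tf.keras.layers.Dense(128, activation="relu")(x)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)            # order-independent aggregation
    x = tf.keras.layers.Dense(128, activation="relu")(x)
    outputs = tf.keras.layers.Dense(out_dim)(x)                # estimated vectorized PD
    return tf.keras.Model(inputs, outputs)

model = make_set_regressor()
model.compile(optimizer="adam", loss="mse")
# model.fit(point_clouds, targets, ...)  # point_clouds: (n_clouds, n_points, 2), targets: (n_clouds, 400)
```

Once trained against the exact targets, a single forward pass yields the estimated vectorized PD for a new point cloud, which is the source of the efficiency gain described in the abstract.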
