Paper Title

Deep Manifold Transformation for Nonlinear Dimensionality Reduction

Authors

Li, Stan Z., Zang, Zelin, Wu, Lirong

Abstract

Manifold learning-based encoders have been playing important roles in nonlinear dimensionality reduction (NLDR) for data exploration. However, existing methods can often fail to preserve geometric, topological and/or distributional structures of data. In this paper, we propose a deep manifold learning framework, called deep manifold transformation (DMT) for unsupervised NLDR and embedding learning. DMT enhances deep neural networks by using cross-layer local geometry-preserving (LGP) constraints. The LGP constraints constitute the loss for deep manifold learning and serve as geometric regularizers for NLDR network training. Extensive experiments on synthetic and real-world data demonstrate that DMT networks outperform existing leading manifold-based NLDR methods in terms of preserving the structures of data.
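The cross-layer local geometry-preserving (LGP) constraint described in the abstract can be sketched, in a much simplified form, as a neighborhood-restricted distance-matching penalty between an input batch and its low-dimensional embedding. The function below is an illustrative stand-in, not the paper's exact formulation: the neighbor count `k` and the plain squared-distance penalty are assumptions made for clarity.

```python
import numpy as np

def lgp_loss(X, Z, k=3):
    """Illustrative LGP-style geometric regularizer (simplified sketch,
    not the DMT paper's exact loss). For each point, penalizes the squared
    mismatch between input-space and embedding-space distances to its
    k nearest input-space neighbors."""
    n = X.shape[0]
    # Pairwise Euclidean distance matrices in the input and embedding spaces.
    dX = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    dZ = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    loss = 0.0
    for i in range(n):
        # k nearest input-space neighbors of point i (index 0 is the point itself).
        nbrs = np.argsort(dX[i])[1:k + 1]
        loss += np.sum((dX[i, nbrs] - dZ[i, nbrs]) ** 2)
    return loss / (n * k)
```

An embedding that preserves local distances exactly (e.g., an identity or rigid mapping) drives this penalty to zero, which is the sense in which such a term acts as a geometric regularizer during NLDR network training.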
