Paper Title
Discrete Tree Flows via Tree-Structured Permutations
Paper Authors
Paper Abstract
While normalizing flows for continuous data have been extensively researched, flows for discrete data have only recently been explored. These prior models, however, suffer from limitations that are distinct from those of continuous flows. Most notably, discrete flow-based models cannot be straightforwardly optimized with conventional deep learning methods because gradients of discrete functions are undefined or zero. Previous works approximate pseudo-gradients of the discrete functions but do not solve the problem at a fundamental level. In addition, backpropagation can be computationally burdensome compared to alternative discrete algorithms such as decision tree algorithms. Our approach seeks to reduce the computational burden and remove the need for pseudo-gradients by developing a discrete flow based on decision trees -- building upon the success of efficient tree-based methods for classification and regression on discrete data. We first define a tree-structured permutation (TSP) that compactly encodes a permutation of discrete data whose inverse is easy to compute; thus, we can efficiently compute density values and sample new data. We then propose a decision tree algorithm to build TSPs that learns the tree structure and the permutations at each node via novel criteria. We empirically demonstrate the feasibility of our method on multiple datasets.
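To make the core idea concrete, below is a minimal, hypothetical sketch of a tree-structured permutation applied to integer-coded categorical vectors. It is not the authors' implementation: the class and attribute names (TSPNode, perms, split_feature, split_value) are illustrative, and it assumes a simplified routing rule in which no descendant permutes its ancestor's split feature, so the split value is still visible in the output and the inverse can retrace the same root-to-leaf path.

```python
# Minimal sketch of a tree-structured permutation (TSP) on categorical vectors.
# Each node applies per-feature permutations, then routes the point to a child
# based on one (already permuted) feature value. Invertibility of the routing
# relies on the simplifying assumption stated above.
import numpy as np


class TSPNode:
    def __init__(self, perms, split_feature=None, split_value=None,
                 left=None, right=None):
        # perms: dict {feature index -> permutation p of that feature's
        # categories}, i.e. category c is rewritten to p[c].
        self.perms = perms
        self.inv_perms = {f: np.argsort(p) for f, p in perms.items()}
        self.split_feature = split_feature   # None marks a leaf
        self.split_value = split_value
        self.left, self.right = left, right

    def forward(self, x):
        """Permute x at this node, then recurse into the routed child."""
        y = x.copy()
        for f, p in self.perms.items():
            y[f] = p[y[f]]
        if self.split_feature is None:
            return y
        child = self.left if y[self.split_feature] == self.split_value else self.right
        return child.forward(y)

    def inverse(self, z):
        """Route on the split value (untouched below this node), undo the
        child's permutations first, then this node's."""
        if self.split_feature is None:
            y = z.copy()
        else:
            child = self.left if z[self.split_feature] == self.split_value else self.right
            y = child.inverse(z)
        for f, p_inv in self.inv_perms.items():
            y[f] = p_inv[y[f]]
        return y


# Toy usage: two binary features; the root swaps feature 1's categories and
# routes on feature 0, which no node below it permutes.
leaf = TSPNode(perms={})                       # identity leaf
root = TSPNode(perms={1: np.array([1, 0])},    # swap categories 0 <-> 1
               split_feature=0, split_value=0, left=leaf, right=leaf)
x = np.array([0, 1])
z = root.forward(x)                            # -> [0, 0]
assert np.array_equal(root.inverse(z), x)      # bijection: inverse recovers x
```

Because the whole map is a composition of permutations, it is a bijection with unit Jacobian in the discrete sense: the model density of a data point is simply the base density evaluated at its forward image, and sampling only requires drawing from the base distribution and applying the cheap inverse pass.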