Paper Title
Recurrent Graph Tensor Networks: A Low-Complexity Framework for Modelling High-Dimensional Multi-Way Sequence
Paper Authors
Paper Abstract
Recurrent Neural Networks (RNNs) are among the most successful machine learning models for sequence modelling, but tend to suffer from an exponential increase in the number of parameters when dealing with large multidimensional data. To this end, we develop a multi-linear graph filter framework for approximating the modelling of hidden states in RNNs, which is embedded in a tensor network architecture to improve modelling power and reduce parameter complexity, resulting in a novel Recurrent Graph Tensor Network (RGTN). The proposed framework is validated through several multi-way sequence modelling tasks and benchmarked against traditional RNNs. By virtue of the domain-aware information processing of graph filters and the expressive power of tensor networks, we show that the proposed RGTN is capable of not only outperforming standard RNNs, but also mitigating the Curse of Dimensionality associated with traditional RNNs, demonstrating superior properties in terms of performance and complexity.
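To make the parameter-saving idea concrete, here is a minimal sketch (not the paper's exact construction, and all names are illustrative): the dense recurrent weight matrix of a standard RNN is replaced by a polynomial graph filter, i.e. a polynomial in a graph shift matrix S, so a degree-K filter costs only K+1 scalar coefficients instead of N×N dense weights.

```python
import numpy as np

def graph_filter(S, coeffs):
    """Build H = sum_k coeffs[k] * S^k, a polynomial graph filter."""
    N = S.shape[0]
    H = np.zeros((N, N))
    Sk = np.eye(N)  # S^0
    for a in coeffs:
        H += a * Sk
        Sk = Sk @ S  # advance to the next power of S
    return H

def recurrent_step(h_prev, x, H, U):
    """One recurrent update with a graph-filter state transition
    in place of a dense recurrent weight matrix."""
    return np.tanh(H @ h_prev + U @ x)

# Toy example: 4-node cycle graph with hypothetical filter coefficients.
S = np.roll(np.eye(4), 1, axis=1)            # shift matrix of a 4-cycle
H = graph_filter(S, coeffs=[0.5, 0.3, 0.1])  # degree-2 filter: 3 params vs 16
U = np.eye(4)                                # toy input projection
h = np.zeros(4)
for x in np.eye(4):                          # feed a short toy sequence
    h = recurrent_step(h, x, H, U)
print(h.shape)  # (4,)
```

The saving compounds for multi-way data: in the tensorised setting described in the abstract, such filters act along each mode separately, which is what keeps the parameter count from growing exponentially with the number of modes.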