Paper Title

Memory-Gated Recurrent Networks

Paper Authors

Yaquan Zhang, Qi Wu, Nanbo Peng, Min Dai, Jing Zhang, Hu Wang

Paper Abstract

The essence of multivariate sequential learning is all about how to extract dependencies in data. These data sets, such as hourly medical records in intensive care units and multi-frequency phonetic time series, oftentimes exhibit not only strong serial dependencies in the individual components (the "marginal" memory) but also non-negligible memories in the cross-sectional dependencies (the "joint" memory). Because of the multivariate complexity in the evolution of the joint distribution that underlies the data generating process, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), with gates explicitly regulating two distinct types of memories: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that our proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
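The abstract describes the idea only at a high level: each input component keeps its own "marginal" memory of its serial dependence, while a separate "joint" memory tracks the cross-sectional dependence among components, with gates regulating each. The sketch below is one minimal way to arrange such a two-level recurrent cell in PyTorch, using off-the-shelf GRU cells as stand-ins for the gating; the class name, the use of GRU cells, and all dimensions are illustrative assumptions and not the paper's actual mGRN equations.

```python
# Illustrative sketch only: separate per-component "marginal" memories plus a
# cross-component "joint" memory. The gate forms and wiring are assumptions,
# not the mGRN equations from the paper.
import torch
import torch.nn as nn


class MemoryGatedCellSketch(nn.Module):
    def __init__(self, n_components: int, marginal_dim: int, joint_dim: int):
        super().__init__()
        self.n_components = n_components
        self.marginal_dim = marginal_dim
        self.joint_dim = joint_dim
        # One gated recurrent cell per input component tracks that
        # component's own "marginal" memory.
        self.marginal_cells = nn.ModuleList(
            [nn.GRUCell(1, marginal_dim) for _ in range(n_components)]
        )
        # A single gated cell over the concatenated marginal states tracks
        # the cross-sectional "joint" memory.
        self.joint_cell = nn.GRUCell(n_components * marginal_dim, joint_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_components)
        batch, seq_len, _ = x.shape
        h_marg = [
            torch.zeros(batch, self.marginal_dim, device=x.device)
            for _ in range(self.n_components)
        ]
        h_joint = torch.zeros(batch, self.joint_dim, device=x.device)
        for t in range(seq_len):
            # Each marginal memory is updated from its own series only.
            h_marg = [
                cell(x[:, t, i : i + 1], h_marg[i])
                for i, cell in enumerate(self.marginal_cells)
            ]
            # The joint memory is updated from all marginal states together.
            h_joint = self.joint_cell(torch.cat(h_marg, dim=-1), h_joint)
        # Final representation combines both memory types.
        return torch.cat(h_marg + [h_joint], dim=-1)


if __name__ == "__main__":
    model = MemoryGatedCellSketch(n_components=3, marginal_dim=8, joint_dim=16)
    out = model(torch.randn(4, 20, 3))  # batch=4, seq_len=20, 3 components
    print(out.shape)  # torch.Size([4, 40]) = 3 * 8 marginal + 16 joint
```

The design choice the abstract emphasizes is the separation itself: keeping the marginal memories isolated from one another, and routing cross-sectional information only through the joint memory, so that the two types of dependence are regulated by distinct gates rather than mixed in a single hidden state.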
