Paper Title

Dense Residual Network: Enhancing Global Dense Feature Flow for Character Recognition

Paper Authors

Zhao Zhang, Zemin Tang, Yang Wang, Zheng Zhang, Choujun Zhan, Zhengjun Zha, Meng Wang

Abstract

Deep Convolutional Neural Networks (CNNs), such as Dense Convolutional Networks (DenseNet), have achieved great success in image representation by discovering deep hierarchical information. However, most existing networks simply stack the convolutional layers and hence fail to fully discover local and global feature information among layers. In this paper, we mainly explore how to enhance the local and global dense feature flow by fully exploiting hierarchical features from all the convolutional layers. Technically, we propose an efficient and effective CNN framework, i.e., the Fast Dense Residual Network (FDRN), for text recognition. To construct FDRN, we propose a new fast residual dense block (f-RDB) that retains the local feature fusion and local residual learning abilities of the original RDB while reducing computational cost. After fully learning local residual dense features, we utilize the sum operation and several f-RDBs to define a new block, termed the global dense block (GDB), which imitates the construction of dense blocks to adaptively learn global dense residual features in a holistic way. Finally, we use two convolutional layers to construct a down-sampling block that reduces the global feature size and extracts deeper features. Extensive simulations show that FDRN obtains enhanced recognition results compared with other related models.
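The blocks described in the abstract can be sketched in PyTorch as follows. This is a minimal illustrative sketch, not the authors' exact implementation: the layer counts, growth rate, and channel widths below are assumptions chosen for readability.

```python
import torch
import torch.nn as nn

class FastRDB(nn.Module):
    """f-RDB sketch: densely connected 3x3 convs, a 1x1 fusion conv
    (local feature fusion), and a local residual skip connection.
    Hyperparameters here are illustrative assumptions."""
    def __init__(self, channels=32, growth=16, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, 3, padding=1),
                nn.ReLU(inplace=True)))
            in_ch += growth
        # 1x1 conv fuses all concatenated local features back to `channels`
        self.fuse = nn.Conv2d(in_ch, channels, 1)

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            # each layer sees the concatenation of all previous features
            feats.append(layer(torch.cat(feats, dim=1)))
        # local feature fusion + local residual learning
        return x + self.fuse(torch.cat(feats, dim=1))

class GlobalDenseBlock(nn.Module):
    """GDB sketch: several f-RDBs whose outputs are combined with sum
    operations to form a global residual feature flow."""
    def __init__(self, channels=32, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            FastRDB(channels) for _ in range(num_blocks))

    def forward(self, x):
        out = x
        for block in self.blocks:
            out = out + block(out)  # sum-based global dense flow
        return out

class DownSampleBlock(nn.Module):
    """Two convolutional layers: the first shrinks the global feature
    map (stride 2), the second extracts deeper features."""
    def __init__(self, in_ch=32, out_ch=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1))

    def forward(self, x):
        return self.body(x)
```

For example, a 1x32x16x16 feature map passes through the GDB unchanged in shape (the residual paths preserve spatial size and channel count), and the down-sampling block then halves the spatial resolution while widening the channels.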
