Paper Title

Arbitrary Handwriting Image Style Transfer

Paper Authors

Kai Yang, Xiaoman Liang, Huihuang Zhao

Paper Abstract

This paper proposes a method for imitating handwriting styles via style transfer. We propose a neural network model based on conditional generative adversarial networks (cGAN) for handwriting style transfer, and we improve the loss function over the standard GAN formulation. Compared with other handwriting imitation methods, both the quality and the efficiency of handwriting style transfer are significantly improved. Experiments show that the generated Chinese characters have clear, well-formed shapes, and analysis of the experimental data shows that the generative adversarial network performs well on handwriting style transfer. The generated text images are closer to real handwriting and achieve better performance in handwriting imitation.
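
The abstract only states that the model is a cGAN with an improved loss function and does not give the exact formulation. As a minimal sketch, assuming a pix2pix-style setup in which the generator maps a source glyph image to the target handwriting style and the discriminator judges (source, image) pairs, the combined adversarial-plus-L1 objective could look like the following; the names `D`, `source`, `generated`, `target`, and the weight `lambda_l1` are illustrative assumptions, not the paper's actual design.

```python
# Sketch of a pix2pix-style cGAN objective (adversarial + L1), assumed setup only.
import torch
import torch.nn as nn

adv_criterion = nn.BCEWithLogitsLoss()  # adversarial loss on discriminator logits
l1_criterion = nn.L1Loss()              # pixel-level reconstruction loss
lambda_l1 = 100.0                       # assumed weighting, as in pix2pix

def generator_loss(D, source, generated, target):
    """Adversarial term plus an L1 term pulling the output toward the target style image."""
    pred_fake = D(torch.cat([source, generated], dim=1))   # condition D on the source glyph
    adv = adv_criterion(pred_fake, torch.ones_like(pred_fake))
    rec = l1_criterion(generated, target)
    return adv + lambda_l1 * rec

def discriminator_loss(D, source, generated, target):
    """Standard cGAN discriminator loss on real and fake (source, image) pairs."""
    pred_real = D(torch.cat([source, target], dim=1))
    pred_fake = D(torch.cat([source, generated.detach()], dim=1))
    loss_real = adv_criterion(pred_real, torch.ones_like(pred_real))
    loss_fake = adv_criterion(pred_fake, torch.zeros_like(pred_fake))
    return 0.5 * (loss_real + loss_fake)
```

In this assumed formulation the L1 term is what keeps the generated character's stroke structure aligned with the target handwriting sample, while the adversarial term sharpens texture and stroke edges.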
