Title

Capacity-Approaching Autoencoders for Communications

Authors

Letizia, Nunzio A., Tonello, Andrea M.

Abstract

The autoencoder concept has fostered the reinterpretation and the design of modern communication systems. It consists of an encoder, a channel, and a decoder block which modify their internal neural structure in an end-to-end learning fashion. However, the current approach to training an autoencoder relies on the use of the cross-entropy loss function. This approach can be prone to overfitting issues and often fails to learn an optimal system and signal representation (code). In addition, less is known about the autoencoder's ability to design channel capacity-approaching codes, i.e., codes that maximize the input-output mutual information under a certain power constraint. The task is even more formidable for an unknown channel, whose capacity is unknown and therefore has to be learnt. In this paper, we address the challenge of designing capacity-approaching codes by incorporating the presence of the communication channel into a novel loss function for the autoencoder training. In particular, we exploit the mutual information between the transmitted and received signals as a regularization term in the cross-entropy loss function, with the aim of controlling the amount of information stored. By jointly maximizing the mutual information and minimizing the cross-entropy, we propose a methodology that a) computes an estimate of the channel capacity and b) constructs an optimal coded signal approaching it. Several simulation results offer evidence of the potential of the proposed method.
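The combined objective the abstract describes (minimize cross-entropy while maximizing a mutual-information estimate, weighted by a regularization coefficient) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the `beta` weight, and the Gaussian closed-form MI proxy (which treats `y - x` as AWGN noise; the paper instead learns an MI estimator for unknown channels) are all assumptions made here for clarity.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Average negative log-likelihood of the true transmitted messages,
    # given the decoder's softmax output `probs` (shape: batch x messages).
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def gaussian_mi_estimate(x, y):
    # Crude MI proxy (an assumption of this sketch): treat the residual
    # y - x as Gaussian noise and apply 0.5 * log2(1 + SNR).
    signal_power = np.mean(x ** 2)
    noise_power = np.mean((y - x) ** 2)
    return 0.5 * np.log2(1.0 + signal_power / noise_power)

def autoencoder_loss(probs, labels, x, y, beta=0.1):
    # Joint objective: minimize cross-entropy while maximizing the
    # mutual-information estimate between transmitted x and received y.
    return cross_entropy(probs, labels) - beta * gaussian_mi_estimate(x, y)
```

In this toy form, driving the loss down pushes the decoder toward correct message recovery (low CE) while rewarding encoder outputs that preserve information through the channel (high MI); the MI term also doubles as a running estimate of achievable rate.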
