Paper Title

Asymptotically Optimal Stochastic Lossy Coding of Markov Sources

Authors

Ahmed Elshafiy, Kenneth Rose

Abstract

An effective 'on-the-fly' mechanism for stochastic lossy coding of Markov sources using string matching techniques is proposed in this paper. Earlier work has shown that the rate-distortion bound can be asymptotically achieved by a 'natural type selection' (NTS) mechanism, which iteratively encodes asymptotically long source strings (from an unknown source distribution P) and regenerates the codebook according to a maximum-likelihood distribution framework, after observing a set of K codewords that 'd-match' (i.e., satisfy the distortion constraint for) a respective set of K source words. This result was later generalized to sources with memory under the assumption that the source words consist of a sequence of asymptotic-length vectors (or super-symbols) over the source super-alphabet, i.e., the source is treated as a vector source. However, the earlier result suffers from a significant practical flaw: it requires the super-symbol (and correspondingly the super-alphabet) length to grow to infinity in order to achieve the rate-distortion bound, even for finite-memory sources such as Markov sources. This implies that the complexity of the NTS iteration explodes beyond any practical capability, compromising the promise of the NTS algorithm in practical scenarios for sources with memory. This work describes a considerably more efficient and tractable mechanism that achieves asymptotically optimal performance, given a prescribed memory constraint, within a practical framework tailored to Markov sources. More specifically, the algorithm asymptotically finds the optimal codebook reproduction distribution, within a constrained set of distributions having the Markov property of a prescribed order, that achieves the minimum per-letter coding rate while maintaining a specified distortion level.
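To make the NTS idea concrete, the sketch below runs a few NTS-style iterations for a binary source under per-letter Hamming distortion, with the codebook reproduction distribution constrained to a first-order (binary) Markov chain in the spirit of the abstract. This is only an illustrative toy approximation with short word lengths; all parameter values and helper names (sample_markov, d_match, ml_transition_estimate) are hypothetical assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)


def sample_markov(trans, length, p0=0.5):
    # Sample a binary string from a first-order Markov chain where
    # trans[s] = P(next symbol = 1 | current symbol = s).
    y = np.empty(length, dtype=np.int8)
    y[0] = rng.random() < p0
    for i in range(1, length):
        y[i] = rng.random() < trans[y[i - 1]]
    return y


def d_match(x, trans, d_max, max_trials=200_000):
    # Draw codewords from the Markov reproduction distribution until one is
    # within per-letter Hamming distortion d_max of the source word x
    # (equivalent in distribution to scanning a randomly generated codebook
    # for the first d-match).
    for _ in range(max_trials):
        y = sample_markov(trans, len(x))
        if np.mean(x != y) <= d_max:
            return y
    raise RuntimeError("no d-match found; relax d_max or raise max_trials")


def ml_transition_estimate(words):
    # ML (empirical) re-estimation of the order-1 transition probabilities
    # from the d-matching codewords, with Laplace smoothing to avoid zeros.
    counts = np.ones((2, 2))
    for y in words:
        for a, b in zip(y[:-1], y[1:]):
            counts[a, b] += 1
    return counts[:, 1] / counts.sum(axis=1)  # P(1 | previous symbol = 0, 1)


def nts(source_trans=(0.2, 0.8), d_max=0.2, length=16, K=200, n_iters=6):
    # One NTS-style loop: encode K source words from a Markov source, then
    # regenerate the codebook distribution as the ML Markov estimate of the
    # observed d-matching codewords.
    trans = np.array([0.5, 0.5])  # start from the uniform i.i.d. codebook
    for it in range(n_iters):
        matches = []
        for _ in range(K):
            x = sample_markov(np.array(source_trans), length)
            matches.append(d_match(x, trans, d_max))
        trans = ml_transition_estimate(matches)
        print(f"iter {it}: P(1|0) = {trans[0]:.3f}, P(1|1) = {trans[1]:.3f}")
    return trans


if __name__ == "__main__":
    nts()

In this toy setting with a bursty source, the estimated transition probabilities typically drift away from the uniform i.i.d. starting point toward a Markov reproduction distribution better matched to the source, mirroring the codebook regeneration step described in the abstract; the paper's actual analysis concerns the asymptotic regime of long source words.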
