Paper Title

A Theory of Computational Resolution Limit for Line Spectral Estimation

Authors

Ping Liu, Hai Zhang

Abstract

Line spectral estimation is a classical signal processing problem that aims to estimate the line spectra from a signal contaminated by deterministic or random noise. Despite a large body of research on this subject, the theoretical understanding of this problem is still elusive. In this paper, we introduce and quantitatively characterize two resolution limits for the line spectral estimation problem under deterministic noise: one is the minimum separation distance between the line spectra that is required for exact detection of their number, and the other is the minimum separation distance between the line spectra that is required for a stable recovery of their supports. The quantitative results imply a phase transition phenomenon in each of the two recovery problems, and also reveal a subtle difference between the two. We further propose a sweeping singular-value-thresholding algorithm for the number detection problem and conduct numerical experiments. The numerical results confirm the phase transition phenomenon in the number detection problem.
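
The abstract does not spell out the algorithm, but the following minimal Python sketch illustrates the general idea behind singular-value-thresholding-based number detection: build a Hankel matrix from noisy samples of a sum of complex sinusoids and count the singular values that rise above a noise-dependent threshold. The function names (`simulate_signal`, `hankel_matrix`, `detect_number`) and the threshold scaling `factor` are illustrative assumptions; the paper's actual sweeping singular-value-thresholding procedure and its theoretically derived thresholds are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: a fixed-threshold singular-value test for the
# number of line spectra. The paper's sweeping singular-value-thresholding
# algorithm is not reproduced here.

def simulate_signal(freqs, amps, n, noise_level, rng):
    """Samples y[q] = sum_j a_j * exp(2*pi*i * f_j * q) + noise, q = 0..n-1."""
    q = np.arange(n)
    clean = sum(a * np.exp(2j * np.pi * f * q) for f, a in zip(freqs, amps))
    noise = noise_level * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    return clean + noise

def hankel_matrix(y):
    """Hankel matrix H[j, k] = y[j + k] of roughly square shape."""
    n = len(y)
    m = n // 2 + 1
    return np.array([[y[j + k] for k in range(n - m + 1)] for j in range(m)])

def detect_number(y, noise_level, factor=3.0):
    """Count singular values exceeding a threshold proportional to the noise level.

    The scaling `factor` is a hypothetical choice for this demo, not the
    threshold analyzed in the paper.
    """
    s = np.linalg.svd(hankel_matrix(y), compute_uv=False)
    return int(np.sum(s > factor * noise_level * np.sqrt(len(y))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_freqs = [0.10, 0.20, 0.40]   # well separated relative to 1/n
    y = simulate_signal(true_freqs, amps=[1.0, 1.0, 1.0], n=64,
                        noise_level=0.05, rng=rng)
    print("detected number of line spectra:", detect_number(y, noise_level=0.05))
```

When the frequencies are well separated, the Hankel matrix of the noiseless samples has rank equal to the number of spectra, so counting the "large" singular values of the noisy matrix recovers that number; as the separation shrinks toward the resolution limit, the smallest signal singular values drop below the noise level and detection fails, which is the phase transition the abstract refers to.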
