Paper Title

Enhanced Blind Calibration of Uniform Linear Arrays with One-Bit Quantization by Kullback-Leibler Divergence Covariance Fitting

Authors

Amir Weiss, Arie Yeredor

Abstract

One-bit quantization has recently become an attractive option for data acquisition in cutting-edge applications, due to the increasing demand for low power consumption and higher sampling rates. Subsequently, the rejuvenated one-bit array processing field is now receiving more attention, as "classical" array processing techniques are adapted or modified accordingly. However, array calibration, often an instrumental preliminary stage in array processing, has so far received little attention in its one-bit form. In this paper, we present a novel solution approach for the blind calibration problem, namely, without using known calibration signals. In order to extract the information within the second-order statistics of the quantized measurements, we propose to estimate the unknown sensors' gain and phase offsets according to a Kullback-Leibler Divergence (KLD) covariance fitting criterion. We then provide a quasi-Newton solution algorithm with a consistent initial estimate, and demonstrate the improved accuracy of our KLD-based estimates in simulations.
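The abstract states that the method extracts information from the second-order statistics of the one-bit quantized measurements. For zero-mean Gaussian signals, the classical arcsine (Van Vleck) law relates the correlation of the sign-quantized samples to the correlation of the underlying analog samples, E[sign(x)·sign(y)] = (2/π)·arcsin(ρ); this is background the abstract relies on implicitly, not the paper's algorithm itself. A minimal numerical sketch (real-valued signals, synthetic data) illustrates the relation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two zero-mean, unit-variance jointly Gaussian signals with correlation rho.
rho = 0.6
n = 200_000
cov = np.array([[1.0, rho], [rho, 1.0]])
x = rng.multivariate_normal(np.zeros(2), cov, size=n)

# One-bit quantization retains only the sign of each measurement.
q = np.sign(x)

# Empirical correlation of the quantized samples ...
r_quantized = np.mean(q[:, 0] * q[:, 1])

# ... versus the arcsine law prediction (2/pi)*arcsin(rho).
r_arcsine = (2.0 / np.pi) * np.arcsin(rho)

print(r_quantized, r_arcsine)  # both close to 0.41
```

Inverting this map is what allows covariance-based calibration criteria, such as the KLD covariance fitting proposed in the paper, to operate on one-bit data despite the severe quantization.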
