Title
Variance of entropy for testing time-varying regimes with an application to meme stocks
Authors
Abstract
Shannon entropy is the most common metric for measuring the degree of randomness of time series in many fields, ranging from physics and finance to medicine and biology. Real-world systems are in general non-stationary, with an entropy value that is not constant in time. The goal of this paper is to propose a hypothesis testing procedure to test the null hypothesis of constant Shannon entropy for time series, against the alternative of a significant variation of the entropy between two subsequent periods. To this end, we find an unbiased approximation of the variance of the Shannon entropy estimator, up to order O(n^(-4)), with n the sample size. In order to characterize the variance of the estimator, we first obtain explicit formulas for the central moments of both the binomial and the multinomial distributions, which describe the distribution of the Shannon entropy. Second, we find the optimal length of the rolling window used for estimating the time-varying Shannon entropy by optimizing a novel self-consistent criterion based on the count of significant variations of entropy within a time window. We corroborate our findings by using the novel methodology to test for time-varying regimes of entropy in stock price dynamics, in particular considering the case of meme stocks in 2020 and 2021. We empirically show the existence of periods of market inefficiency for meme stocks. In particular, sharp increases in prices and trading volumes correspond to statistically significant drops in Shannon entropy.
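The testing idea described in the abstract can be sketched as follows: estimate the Shannon entropy on two subsequent windows of a symbolized series and compare the difference to its standard error. The paper's contribution is an unbiased variance approximation up to order O(n^(-4)); that formula is not given in the abstract, so this sketch substitutes the classical first-order (delta-method) approximation Var(Ĥ) ≈ (Σᵢ pᵢ ln²pᵢ − H²)/n as a stand-in. All function names here are illustrative, not the authors'.

```python
# Hedged sketch: z-test for a change in Shannon entropy between two
# windows of a symbolized time series. Uses the classical first-order
# variance approximation, NOT the paper's O(n^-4) unbiased refinement.
import math
from collections import Counter

def entropy_and_var(symbols):
    """Plug-in Shannon entropy (in nats) and its first-order variance."""
    n = len(symbols)
    probs = [c / n for c in Counter(symbols).values()]
    h = -sum(p * math.log(p) for p in probs)
    # First-order delta-method variance; note it vanishes for a uniform
    # distribution, which is one reason higher-order terms matter.
    var = (sum(p * math.log(p) ** 2 for p in probs) - h ** 2) / n
    return h, var

def entropy_change_z(window1, window2):
    """z-statistic for H0: the entropy is equal in the two windows."""
    h1, v1 = entropy_and_var(window1)
    h2, v2 = entropy_and_var(window2)
    return (h2 - h1) / math.sqrt(v1 + v2)

# Illustrative use with a binary symbolization (e.g. sign of returns):
# a skewed window (lower entropy) versus a near-balanced one.
z = entropy_change_z(['u'] * 70 + ['d'] * 30, ['u'] * 55 + ['d'] * 45)
```

A large |z| would reject constant entropy between the two periods; in the paper's application, significant negative changes flag entropy drops during meme-stock price run-ups.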