Paper Title

Dimension reduction of dynamical systems on networks with leading and non-leading eigenvectors of adjacency matrices

Paper Authors

Naoki Masuda, Prosenjit Kundu

Paper Abstract

Dimension reduction techniques for dynamical systems on networks are considered to promote our understanding of the original high-dimensional dynamics. One strategy of dimension reduction is to derive a low-dimensional dynamical system whose behavior approximates the observables of the original dynamical system that are weighted linear summations of the state variables at the different nodes. Recently proposed methods use the leading eigenvector of the adjacency matrix of the network as the mixture weights to obtain such observables. In the present study, we explore performances of this type of one-dimensional reductions of dynamical systems on networks when we use non-leading eigenvectors of the adjacency matrix as the mixture weights. Our theory predicts that non-leading eigenvectors can be more efficient than the leading eigenvector and enables us to select the eigenvector minimizing the error. We numerically verify that the optimal non-leading eigenvector outperforms the leading eigenvector for some dynamical systems and networks. We also argue that, despite our theory, it is practically better to use the leading eigenvector as the mixture weights to avoid misplacing the bifurcation point too distantly and to be resistant against dynamical noise.
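
The abstract describes building a one-dimensional observable as a weighted linear summation of the node states, with an eigenvector of the network's adjacency matrix (leading or non-leading) supplying the mixture weights. The following minimal Python/NumPy sketch only illustrates that construction on a toy undirected network; the adjacency matrix, node states, and normalization are illustrative assumptions and are not the networks, dynamics, or reduced equations studied in the paper.

```python
import numpy as np

# Adjacency matrix of a small undirected example network. This matrix,
# like the node states below, is purely illustrative and not taken from
# the paper.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 1, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Eigendecomposition of the symmetric adjacency matrix. np.linalg.eigh
# returns eigenvalues in ascending order, so the last column holds the
# leading eigenvector (largest eigenvalue).
eigenvalues, eigenvectors = np.linalg.eigh(A)

def observable(x, k):
    """One-dimensional observable: a weighted linear summation of the
    node states x, using the k-th eigenvector of A as mixture weights
    (k = -1 selects the leading eigenvector). eigh returns unit-norm
    eigenvectors; the paper's normalization convention may differ."""
    a = eigenvectors[:, k]
    return a @ x

# A snapshot of hypothetical node states of some dynamics on the network.
x = np.array([0.2, 0.5, 0.9, 0.4, 0.1])

print("Observable with the leading eigenvector:   ", observable(x, -1))
print("Observable with a non-leading eigenvector: ", observable(x, -2))
```

Deriving the reduced one-dimensional dynamical system itself, and the error criterion the authors use to select the optimal eigenvector, are beyond the scope of this sketch.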
