Paper Title

Addressing Data Heterogeneity in Decentralized Learning via Topological Pre-processing

Paper Authors

Waqwoya Abebe, Ali Jannesari

Paper Abstract

Recently, local peer topology has been shown to influence the overall convergence of decentralized learning (DL) graphs in the presence of data heterogeneity. In this paper, we demonstrate the advantages of constructing a proxy-based locally heterogeneous DL topology to enhance convergence and maintain data privacy. In particular, we propose a novel peer clumping strategy to efficiently cluster peers before arranging them in a final training graph. By showing how locally heterogeneous graphs outperform locally homogeneous graphs of similar size drawn from the same global data distribution, we present a strong case for topological pre-processing. Moreover, we demonstrate the scalability of our approach by showing that the overhead of the proposed topological pre-processing remains small in large graphs while the performance gains become even more pronounced. Furthermore, we show the robustness of our approach in the presence of network partitions.
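
The abstract describes the pipeline only at a high level. The Python below is a minimal illustrative sketch, not the paper's actual algorithm: it assumes a label histogram serves as each peer's privacy-lighter distribution proxy, groups similar peers into clumps with a few k-means iterations, and then interleaves peers from different clumps around a ring so that every local neighborhood is heterogeneous. All function names (`label_histogram`, `clump_peers`, `heterogeneous_ring`) are hypothetical.

```python
import numpy as np

def label_histogram(labels, num_classes):
    """Proxy for a peer's local data distribution: normalized label counts.
    (Assumed proxy; the paper's actual proxy construction may differ.)"""
    hist = np.bincount(labels, minlength=num_classes).astype(float)
    return hist / hist.sum()

def clump_peers(histograms, num_clumps, seed=0, iters=20):
    """Group peers with similar proxy distributions via a few Lloyd
    (k-means) iterations; returns a clump index per peer."""
    rng = np.random.default_rng(seed)
    X = np.asarray(histograms)
    centers = X[rng.choice(len(X), num_clumps, replace=False)]
    for _ in range(iters):
        # Assign each peer to its nearest clump center.
        assign = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(num_clumps):
            if (assign == k).any():
                centers[k] = X[assign == k].mean(axis=0)
    return assign

def heterogeneous_ring(assignments):
    """Arrange peers on a ring by round-robin over clumps, so adjacent
    peers come from different clumps (locally heterogeneous topology)."""
    clumps = {}
    for peer, k in enumerate(assignments):
        clumps.setdefault(k, []).append(peer)
    order, pools = [], list(clumps.values())
    while any(pools):
        for pool in pools:
            if pool:
                order.append(pool.pop())
    return order  # ring edges: order[i] <-> order[(i + 1) % len(order)]

# Example: 12 peers, 4 classes; peer p holds mostly class p % 4 (label skew).
rng = np.random.default_rng(1)
peers = [np.concatenate([np.full(80, p % 4), rng.integers(0, 4, 20)])
         for p in range(12)]
hists = [label_histogram(y, num_classes=4) for y in peers]
ring = heterogeneous_ring(clump_peers(hists, num_clumps=4))
print(ring)  # neighbors on the ring are drawn from different clumps
```

The round-robin ring is just one simple way to realize local heterogeneity after clumping; the paper's final training-graph construction may arrange clumps differently.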
