Paper Title
A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting
Paper Authors
Paper Abstract
We present a new method that includes three key components of distributed optimization and federated learning: variance reduction of stochastic gradients, partial participation, and compressed communication. We prove that the new method has optimal oracle complexity and state-of-the-art communication complexity in the partial participation setting. Regardless of the communication compression feature, our method successfully combines variance reduction and partial participation: we get the optimal oracle complexity, never need the participation of all nodes, and do not require the bounded gradients (dissimilarity) assumption.
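To make the three ingredients concrete, below is a minimal, illustrative sketch, not the paper's algorithm: one common way to combine partial participation, compressed communication (an unbiased rand-k sparsifier), and control variates that damp the variance introduced by compression. The paper's method additionally reduces the variance of stochastic gradients; the problem instance, hyperparameters, and all names below are assumptions made only to give a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_nodes, k = 10, 20, 3                      # dimension, nodes, rand-k budget
A = [rng.standard_normal((5, d)) for _ in range(n_nodes)]
b = [rng.standard_normal(5) for _ in range(n_nodes)]

def local_grad(i, x):
    # Full gradient of node i's least-squares loss 0.5*||A_i x - b_i||^2.
    return A[i].T @ (A[i] @ x - b[i])

def rand_k(v, k):
    # Unbiased rand-k compressor: keep k random coordinates, rescale by d/k.
    idx = rng.choice(v.size, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (v.size / k)
    return out

x = np.zeros(d)
h = np.zeros((n_nodes, d))                     # per-node control variates
h_bar = np.zeros(d)                            # server's running average of h_i
alpha = k / d                                  # shift step size for rand-k
lr, rounds, p = 0.01, 300, 0.3                 # step size, rounds, participation

for _ in range(rounds):
    active = np.flatnonzero(rng.random(n_nodes) < p)   # partial participation
    if active.size == 0:
        continue
    deltas = []
    for i in active:
        g = local_grad(i, x)
        delta = rand_k(g - h[i], k)            # compressed communication
        h[i] += alpha * delta                  # node-side control variate update
        deltas.append(delta)
    g_hat = h_bar + np.mean(deltas, axis=0)    # server's gradient estimate
    h_bar += alpha * np.sum(deltas, axis=0) / n_nodes  # mirror the node updates
    x -= lr * g_hat

full_grad = np.mean([local_grad(i, x) for i in range(n_nodes)], axis=0)
print("gradient norm after training:", np.linalg.norm(full_grad))
```

In this sketch only the sampled nodes compute and send anything in a given round, each message is a k-sparse vector rather than a full gradient, and the control variates h_i keep the server's estimate stable without any bounded-gradient (dissimilarity) assumption on the local losses.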