Paper Title

FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for Federated Learning on Non-IID Data

Authors

Mike He Zhu, Léna Néhale Ezzine, Dianbo Liu, Yoshua Bengio

Abstract

Federated learning is a distributed machine learning approach that enables a shared server model to learn by aggregating parameter updates computed locally on the training data of spatially distributed client silos. Despite its advantages in both scale and privacy, federated learning suffers from domain shift, where learned models fail to generalize to unseen domains whose data distributions are non-i.i.d. with respect to the training domains. In this study, we propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistency across environments and thereby mitigate domain shift in federated networks. Experiments on benchmark and real-world datasets provide evidence that the proposed algorithm outperforms conventional baselines and related federated learning algorithms. This is relevant to various fields such as medical healthcare, computer vision, and the Internet of Things (IoT). The code is released at https://github.com/mikemikezhu/FedILC.
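The abstract's core aggregation idea, combining per-environment gradients through a geometric mean so that only update directions consistent across silos survive, can be illustrated with a minimal PyTorch sketch. The function name, the sign-agreement mask, and the flattened-gradient interface below are illustrative assumptions in the spirit of the invariant-learning-consistency literature; they are not the repository's actual implementation, which also involves the gradient covariance and Hessians described above.

```python
import torch

def weighted_geometric_mean_grads(grads, weights=None, eps=1e-12):
    """Illustrative element-wise weighted geometric mean of per-silo gradients.

    grads:   list of 1-D tensors, one flattened gradient per silo/environment.
    weights: optional per-silo weights summing to 1 (uniform if None).

    The geometric mean is taken over gradient magnitudes; components whose
    signs disagree across silos are zeroed, so only directions that are
    consistent in every environment contribute to the aggregated update.
    """
    g = torch.stack(grads)                      # (num_silos, num_params)
    if weights is None:
        weights = torch.full((g.shape[0],), 1.0 / g.shape[0])
    w = weights.view(-1, 1)

    # Weighted geometric mean of magnitudes in the log domain:
    # exp(sum_i w_i * log|g_i|); eps guards against log(0).
    log_mag = torch.log(g.abs() + eps)
    geo_mag = torch.exp((w * log_mag).sum(dim=0))

    # Keep a component only if every silo agrees on its sign.
    signs = torch.sign(g)
    agree = (signs == signs[0]).all(dim=0)
    return torch.where(agree, signs[0] * geo_mag, torch.zeros_like(geo_mag))

if __name__ == "__main__":
    silo_grads = [torch.tensor([0.2, -0.5, 0.1]),
                  torch.tensor([0.4, -0.3, -0.2])]
    print(weighted_geometric_mean_grads(silo_grads))
    # First two components survive (signs agree); the third is zeroed out.
```

Compared with the arithmetic mean used in standard federated averaging, this geometric-mean aggregation is dominated by the smallest per-silo magnitude, so a direction that helps in one environment but is near zero or reversed in another contributes little, which is the intuition behind the consistency objective the paper builds on.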
