Paper Title
Specialized federated learning using a mixture of experts
Paper Authors
Paper Abstract
In federated learning, clients share a global model that has been trained on decentralized local client data. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current methods show limited privacy properties and have shortcomings when applied to common real-world scenarios, especially when client data is heterogeneous. In this paper, we propose an alternative method to learn a personalized model for each client in a federated setting, with greater generalization abilities than previous methods. To achieve this personalization, we propose a federated learning framework using a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global model. We evaluate our method on a variety of datasets with different levels of data heterogeneity, and our results show that the mixture of experts model is better suited as a personalized model for devices in these settings, outperforming both fine-tuned global models and local specialists.
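
The abstract describes the personalized model only at a high level. The sketch below illustrates one plausible reading of it, assuming the mixture uses a small gating network to blend the output of a locally trained expert with that of a frozen, federally averaged global model; the class and variable names (MixtureOfExperts, gate, in_dim) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a per-client mixture-of-experts personalization layer.
# Assumptions: the global model arrives from federated averaging and is kept
# frozen on the client, while the local expert and the gate are trained only
# on that client's data. All names here are illustrative.
import torch
import torch.nn as nn


class MixtureOfExperts(nn.Module):
    """Personalized client model: a gate blends local specialist and global generalist."""

    def __init__(self, local_expert: nn.Module, global_model: nn.Module, in_dim: int):
        super().__init__()
        self.local_expert = local_expert          # trained only on this client's data
        self.global_model = global_model          # shared model from federated averaging
        for p in self.global_model.parameters():  # keep the shared model fixed locally
            p.requires_grad = False
        self.gate = nn.Sequential(                # gating network, outputs a weight in (0, 1)
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid()
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.gate(x.flatten(1))               # per-example mixing weight, shape (batch, 1)
        local_out = self.local_expert(x)
        global_out = self.global_model(x)
        # Convex combination of the two experts' predictions.
        return w * local_out + (1.0 - w) * global_out
```

Under this reading, only the local expert and the gate are updated on-device, which is one way the specialist knowledge can adapt to heterogeneous client data while the frozen global model continues to supply generalist knowledge.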