Paper Title
ESMFL: Efficient and Secure Models for Federated Learning
Paper Authors
Paper Abstract
Nowadays, deep neural networks are widely applied in various domains. However, the massive data collection required for deep neural networks raises potential privacy issues and also consumes a large amount of communication bandwidth. To address these problems, we propose a privacy-preserving method for the federated learning distributed system, operated on Intel Software Guard Extensions, a set of instructions that increases the security of application code and data. Meanwhile, the encrypted models enlarge the transmission overhead. Hence, we reduce the communication cost by sparsification, which can achieve reasonable accuracy with different model architectures.
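The abstract does not spell out the sparsification step, but a common approach in federated learning is top-k magnitude sparsification of model updates before transmission. The sketch below is a minimal illustration of that general idea, not the paper's exact algorithm; the function name `topk_sparsify` and the NumPy-based implementation are assumptions for illustration only.

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of a gradient array,
    zeroing the rest. Illustrative sketch of update sparsification for
    reducing federated-learning communication cost (not the paper's
    exact method)."""
    flat = grad.ravel()
    if k >= flat.size:
        return grad.copy()
    # argpartition places the k largest |values| in the last k slots
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)

g = np.array([0.05, -0.9, 0.1, 0.7, -0.02])
print(topk_sparsify(g, 2))  # only the entries -0.9 and 0.7 survive
```

In practice, a client would transmit only the surviving indices and values (an encoding far smaller than the dense update), which is how sparsification trades a small accuracy cost for large bandwidth savings.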