Paper Title

EFMVFL: An Efficient and Flexible Multi-party Vertical Federated Learning without a Third Party

Authors

Huang, Yimin, Feng, Xinyu, Wang, Wanwan, He, Hao, Wang, Yukun, Yao, Ming

Abstract

Federated learning allows multiple participants to conduct joint modeling without disclosing their local data. Vertical federated learning (VFL) handles the situation where participants share the same ID space but have different feature spaces. In most VFL frameworks, to protect the security and privacy of the participants' local data, a third party is needed to generate homomorphic encryption key pairs and perform decryption operations. In this way, the third party is granted the right to decrypt information related to model parameters. However, it is not easy to find such a trusted entity in the real world. Existing methods for solving this problem are either communication-intensive or unsuitable for multi-party scenarios. By combining secret sharing and homomorphic encryption, we propose EFMVFL, a novel VFL framework without a third party, which supports flexible expansion to multiple participants with low communication overhead and is applicable to generalized linear models. We give instantiations of our framework under logistic regression and Poisson regression. Theoretical analysis and experiments show that our framework is secure, more efficient, and easy to extend to multiple participants.
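One of the two building blocks the abstract names is secret sharing. The following is a minimal illustrative sketch of additive secret sharing over a prime field, not the paper's actual protocol; the modulus choice and helper names are assumptions for illustration. It shows the key property EFMVFL-style frameworks rely on: parties can add their local shares of two values, and reconstructing the resulting shares yields the sum of the underlying secrets, without any single party learning either secret.

```python
import random

# Illustrative prime modulus (an assumption, not from the paper).
P = 2**61 - 1

def share(secret, n_parties):
    """Split `secret` into n additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod P."""
    return sum(shares) % P

# Sharing and reconstruction round-trip.
secret = 123456
assert reconstruct(share(secret, 3)) == secret

# Local addition of shares reconstructs to the sum of the secrets:
# no party ever sees 10 or 32 in the clear.
a_shares = share(10, 3)
b_shares = share(32, 3)
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 42
```

In the paper's setting this additive homomorphism is combined with homomorphic encryption so that intermediate gradient-related quantities can be aggregated across parties without a third-party decryptor.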
