Learn to Forget: User-Level Memorization Elimination in Federated Learning

Federated learning is a decentralized machine learning technique that has attracted wide attention in both the research community and the real-world market. However, current privacy-preserving federated learning schemes only give users a secure way to contribute their private data; they offer no way to withdraw that contribution from the model afterwards. Such an irreversible setting potentially violates data-protection regulations and increases the risk of data extraction. To resolve this problem, this paper describes a novel concept for federated learning, called memorization elimination. Based on this concept, we propose sysname, a federated learning framework that allows a user to eliminate the memorization of its private data in the trained model. Specifically, each user in sysname is deployed with a trainable dummy gradient generator. After a number of training steps, the generator can produce dummy gradients that stimulate the neurons of the machine learning model so as to eliminate the memorization of the specified data. We also prove that the additional memorization elimination service of sysname neither breaks the common procedure of federated learning nor lowers its security.
Original title: Learn to Forget: User-Level Memorization Elimination in Federated Learning
Original author: Zhuo Ma
Original link: https://arxiv.org/abs/2003.10933
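To make the dummy-gradient idea in the abstract concrete, here is a minimal sketch, assuming a PyTorch 2.x setup; it is an illustration, not the paper's actual implementation. A small trainable generator emits a gradient-shaped vector, and differentiating through one virtual SGD-style model update trains the generator so that applying its output pushes the model's predictions on the to-be-forgotten samples toward an uninformative uniform distribution. Every name here (`DummyGradientGenerator`, the toy `nn.Linear` model, the uniform target, the step size) is a hypothetical placeholder.

```python
# Hypothetical sketch of a trainable "dummy gradient generator" in the spirit
# of the abstract; the paper's real objective and architecture are not shown here.
from math import prod

import torch
import torch.nn as nn
from torch.func import functional_call  # PyTorch 2.x


class DummyGradientGenerator(nn.Module):
    """Maps a random seed to a flat vector shaped like the model's parameters."""

    def __init__(self, seed_dim: int, param_count: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(seed_dim, 64),
            nn.ReLU(),
            nn.Linear(64, param_count),
        )

    def forward(self, seed: torch.Tensor) -> torch.Tensor:
        return self.net(seed)


model = nn.Linear(10, 2)  # stand-in for the jointly trained FL model
names = [n for n, _ in model.named_parameters()]
shapes = [p.shape for _, p in model.named_parameters()]
base = {n: p.detach() for n, p in model.named_parameters()}  # frozen snapshot
param_count = sum(prod(s) for s in shapes)

gen = DummyGradientGenerator(seed_dim=16, param_count=param_count)
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
step_size = 0.1  # learning rate of the virtual model update

x_forget = torch.randn(8, 10)        # the user's private samples to be forgotten
uniform = torch.full((8, 2), 0.5)    # uninformative target over the 2 classes

for _ in range(200):
    flat = gen(torch.randn(1, 16)).squeeze(0)

    # Slice the flat output into per-parameter "dummy gradients" and
    # simulate one SGD-style update of the model with them.
    virtual, offset = {}, 0
    for n, s in zip(names, shapes):
        cnt = prod(s)
        virtual[n] = base[n] - step_size * flat[offset:offset + cnt].view(s)
        offset += cnt

    # After the virtual update, the model should be maximally unsure about
    # the forgotten samples; backprop through the update trains the generator.
    logits = functional_call(model, virtual, (x_forget,))
    loss = nn.functional.cross_entropy(logits, uniform)  # soft (uniform) targets

    opt.zero_grad()
    loss.backward()
    opt.step()
```

A full system would presumably also regularize against data that should be retained, so that eliminating the targeted memorization does not degrade overall accuracy, and the resulting dummy gradients would travel through the unchanged federated aggregation pipeline, which is consistent with the abstract's claim that the service does not break the common procedure of federated learning.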