T1 - Differentially Private Federated Learning with Drift Control

Abstract: In this paper, we consider the problem of differentially private federated learning with statistical data heterogeneity. More specifically, users collaborate with the parameter server (PS) to jointly train a machine learning model using their local datasets, which are non-i.i.d. The PS is assumed to be honest-but-curious, so the data at the users must be kept private from the PS; in particular, interactions between the PS and users must satisfy differential privacy (DP) for each user. In this work, we propose a differentially private mechanism that simultaneously deals with the user drift caused by non-i.i.d. data and with randomized user participation in the training process. Specifically, we study SCAFFOLD, a popular federated learning algorithm that has shown better performance in dealing with non-i.i.d. data than previous federated averaging algorithms. We study the convergence rate of SCAFFOLD under a differential privacy constraint. Our convergence results take into account the time-varying perturbation noises used by the users, as well as data and user sampling. We propose two time-varying noise allocation schemes in order to achieve a better convergence rate while satisfying a total DP privacy budget. We also conduct experiments to confirm our theoretical findings on real-world datasets.

This work has been supported in part by NSF Grants CAREER 1651492, CNS 1715947, CCF 2100013 and the 2018 Keysight Early Career Professor Award.
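The abstract does not spell out the mechanism, but the ingredients it names (SCAFFOLD's control-variate drift correction, plus clipping and Gaussian perturbation of each user's released update for DP) can be sketched generically. The following is a minimal illustration under common assumptions, not the paper's actual scheme: the function name, the `clip` and `noise_std` parameters, and the control-variate update rule (SCAFFOLD's "option II") are all stand-ins chosen for the example, and the calibration of `noise_std` to a total privacy budget is omitted.

```python
import numpy as np

def scaffold_local_update(x_global, c_global, c_local, grad_fn,
                          lr=0.1, local_steps=5, clip=1.0, noise_std=0.0,
                          rng=None):
    """One SCAFFOLD-style client round: local SGD with control-variate
    drift correction, then clipping and Gaussian perturbation of the
    released model update. Illustrative sketch only."""
    rng = rng or np.random.default_rng(0)
    x = x_global.copy()
    for _ in range(local_steps):
        # drift correction: step along g - c_local + c_global instead of g
        g = grad_fn(x)
        x -= lr * (g - c_local + c_global)
    # refresh the local control variate (SCAFFOLD option-II style):
    # c_i <- c_i - c + (x_global - x) / (K * lr)
    c_local_new = c_local - c_global + (x_global - x) / (local_steps * lr)
    delta_x = x - x_global
    # DP release step: clip the update's norm, then add Gaussian noise
    norm = np.linalg.norm(delta_x)
    if norm > clip:
        delta_x = delta_x * (clip / norm)
    delta_x = delta_x + rng.normal(0.0, noise_std, size=delta_x.shape)
    return delta_x, c_local_new
```

In a full round, the PS would average the noisy `delta_x` over the sampled users to update the global model and aggregate the control-variate changes into `c_global`; a time-varying allocation (as the abstract proposes) would vary `noise_std` across rounds subject to the total DP budget.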