Conference paper, 2022

Differentially Private Coordinate Descent for Composite Empirical Risk Minimization

Abstract

Machine learning models can leak information about the data used to train them. To mitigate this issue, Differentially Private (DP) variants of optimization algorithms like Stochastic Gradient Descent (DP-SGD) have been designed to trade off utility for privacy in Empirical Risk Minimization (ERM) problems. In this paper, we propose Differentially Private proximal Coordinate Descent (DP-CD), a new method to solve composite DP-ERM problems. We derive utility guarantees through a novel theoretical analysis of inexact coordinate descent. Our results show that, thanks to larger step sizes, DP-CD can exploit imbalance in gradient coordinates to outperform DP-SGD. We also prove new lower bounds for composite DP-ERM under coordinate-wise regularity assumptions, which DP-CD nearly matches. For practical implementations, we propose to clip gradients using coordinate-wise thresholds that emerge from our theory, avoiding costly hyperparameter tuning. Experiments on real and synthetic data support our results, and show that DP-CD compares favorably with DP-SGD.
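To make the setting concrete, composite ERM problems minimize a smooth data-fitting term plus a convex, possibly nonsmooth regularizer handled through its proximal operator:

```latex
\min_{w \in \mathbb{R}^p} \; \frac{1}{n} \sum_{i=1}^{n} \ell(w; d_i) + \psi(w)
```

where each loss \ell(\cdot; d_i) is smooth in w and \psi (e.g., the L1 norm) is convex but possibly nonsmooth. The sketch below illustrates the flavor of a differentially private proximal coordinate descent update on L1-regularized least squares: the j-th gradient coordinate is averaged from per-example contributions clipped at a coordinate-wise threshold, perturbed with Gaussian noise calibrated to that threshold, and followed by a proximal (soft-thresholding) step. This is a minimal sketch, not the paper's exact algorithm: the parameters `step`, `clip_j`, and `noise_mult` are hypothetical placeholders, whereas the paper derives step sizes and clipping thresholds from coordinate-wise smoothness constants and the privacy budget.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * |x|; handles the L1 composite term.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def dp_cd_step(w, X, y, j, step, clip_j, noise_mult, lam, rng):
    """One coordinate update of a DP proximal coordinate descent sketch
    for L1-regularized least squares. Illustrative only: all parameter
    choices here are hypothetical, not those derived in the paper."""
    n = X.shape[0]
    residual = X @ w - y
    # Per-example contributions to the j-th gradient coordinate, each
    # clipped at the coordinate-wise threshold clip_j before averaging.
    g = np.clip(X[:, j] * residual, -clip_j, clip_j)
    # Gaussian noise scaled to the sensitivity of the averaged, clipped
    # gradient coordinate (2 * clip_j / n for replace-one neighbors).
    noisy_grad = g.mean() + rng.normal(0.0, noise_mult * 2.0 * clip_j / n)
    w = w.copy()
    # Gradient step on coordinate j, then proximal (soft-thresholding) step.
    w[j] = soft_threshold(w[j] - step * noisy_grad, step * lam)
    return w

# Usage: cycle over coordinates on a toy problem.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
w = np.zeros(5)
for t in range(200):
    w = dp_cd_step(w, X, y, j=t % 5, step=0.5, clip_j=1.0,
                   noise_mult=1.0, lam=0.1, rng=rng)
```

Coordinate-wise clipping is what lets such a method adapt to imbalanced gradient coordinates: a coordinate with a small smoothness constant can take a larger step and receive proportionally less noise than a uniform (scalar) clipping threshold would allow.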

Dates and versions

hal-03424974 , version 1 (10-11-2021)
hal-03424974 , version 2 (02-02-2022)
hal-03424974 , version 3 (21-10-2022)

Identifiers

hal-03424974

Cite

Paul Mangold, Aurélien Bellet, Joseph Salmon, Marc Tommasi. Differentially Private Coordinate Descent for Composite Empirical Risk Minimization. ICML 2022 - 39th International Conference on Machine Learning, Jul 2022, Baltimore, United States. pp.14948-14978. ⟨hal-03424974v3⟩