Conference paper

Exploiting regularity in sparse Generalized Linear Models


Abstract

Generalized Linear Models (GLM) are a wide class of regression and classification models, where the predicted variable is obtained from a linear combination of the input variables. For statistical inference in high dimensions, sparsity-inducing regularizations have proven useful while offering statistical guarantees. However, solving the resulting optimization problems can be challenging: even for popular iterative algorithms such as coordinate descent, one needs to loop over a large number of variables. To mitigate this, techniques known as screening rules and working sets diminish the size of the optimization problem at hand, either by progressively removing variables, or by solving a growing sequence of smaller problems. For both of these techniques, significant variables are identified by convex duality. In this paper, we show that the dual iterates of a GLM exhibit a Vector AutoRegressive (VAR) behavior after sign identification, when the primal problem is solved with proximal gradient descent or cyclic coordinate descent. Exploiting this regularity, one can construct dual points that offer tighter control of optimality, enhancing the performance of screening rules and helping to design a competitive working set algorithm.
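To make the role of dual points concrete, here is a minimal sketch for the Lasso, a canonical sparse GLM: proximal gradient descent (ISTA) on the primal, with a dual-feasible point built at each iteration by rescaling the residual. The resulting duality gap certifies optimality, which is exactly the quantity that screening rules and working sets rely on. This is a standard construction, not the paper's VAR extrapolation; the function names and toy data are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(X, y, lmbda, n_iter=500):
    """ISTA on 0.5 * ||y - X w||^2 + lmbda * ||w||_1, tracking the duality gap."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    w = np.zeros(p)
    gaps = []
    for _ in range(n_iter):
        r = y - X @ w                      # residual (gradient is -X^T r)
        w = soft_threshold(w + X.T @ r / L, lmbda / L)
        r = y - X @ w
        # Rescale the residual so theta is dual feasible: ||X^T theta||_inf <= lmbda.
        theta = r / max(1.0, np.linalg.norm(X.T @ r, np.inf) / lmbda)
        primal = 0.5 * r @ r + lmbda * np.abs(w).sum()
        dual = 0.5 * y @ y - 0.5 * (y - theta) @ (y - theta)
        gaps.append(primal - dual)         # nonnegative by weak duality
    return w, gaps

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100))
y = rng.standard_normal(50)
lmbda = 0.3 * np.abs(X.T @ y).max()        # fraction of lambda_max: sparse solution
w, gaps = lasso_ista(X, y, lmbda)
```

The naive residual rescaling above converges to the dual optimum only as fast as the primal iterates; the paper's contribution is to exploit the VAR structure of these dual iterates to extrapolate better dual points, and hence tighter gaps, earlier.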
Main file: spars2019.pdf (1.18 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02288859, version 1 (13-10-2019)

Identifiers

  • HAL Id: hal-02288859, version 1

Cite

Mathurin Massias, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon. Exploiting regularity in sparse Generalized Linear Models. SPARS 2019 - Signal Processing with Adaptive Sparse Structured Representations, Jul 2019, Toulouse, France. ⟨hal-02288859⟩
129 Views
71 Downloads
