École normale supérieure, Paris
Workshop: Mathematical Foundations of Learning Theory


Sparsity in High-Dimensional Learning Problems
Vladimir Koltchinskii (Georgia Institute of Technology)

June 2, 2006

We study penalized empirical risk minimization with convex loss over the linear span of a large finite set H of base functions. The penalty is based on the ℓ_p-norm of the vector of coefficients with p = 1 + c/log N, where N is the cardinality of H. We prove several inequalities that directly relate the "degrees of sparsity" of the empirical and true solutions of such problems, and show what impact sparsity has on the excess risk bounds and on the accuracy of estimation of the vector of coefficients. We discuss several related problems, such as the data-driven choice of the regularization parameter that provides adaptation to the unknown sparsity of the true solution, as well as the problem of adaptation to linear dependencies in the set H. We also discuss the connections of these results to recent work on aggregation of statistical estimators (Tsybakov and coauthors) and to sparse recovery problems in computational harmonic analysis (Donoho, Candès, Tao, among others).
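The setup in the abstract can be illustrated numerically. The sketch below is a hypothetical, minimal implementation (not the speaker's algorithm): it fits squared-loss empirical risk with an ℓ_p penalty, p = 1 + c/log N, by plain gradient descent. For p > 1 the penalty is differentiable, so no proximal machinery is needed. The function name, the learning rate, and the synthetic data are all illustrative choices.

```python
import numpy as np

def lp_penalized_erm(X, y, lam=0.1, c=1.0, lr=0.01, n_iter=5000):
    """Gradient descent for squared-loss ERM with an l_p penalty,
    p = 1 + c / log N, where N is the number of base functions
    (columns of X). Illustrative sketch only."""
    n, N = X.shape
    p = 1.0 + c / np.log(N)
    beta = np.zeros(N)
    for _ in range(n_iter):
        # Gradient of the empirical risk (1/2n) * ||X beta - y||^2.
        grad_loss = X.T @ (X @ beta - y) / n
        # Gradient of lam * sum_j |beta_j|^p; continuous at 0 since p > 1.
        grad_pen = lam * p * np.sign(beta) * np.abs(beta) ** (p - 1)
        beta -= lr * (grad_loss + grad_pen)
    return beta, p

# Sparse ground truth: only 3 of N = 50 coefficients are nonzero.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
beta_true = np.zeros(50)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(200)

beta_hat, p = lp_penalized_erm(X, y)
```

Because p is only slightly above 1 for large N, the penalty behaves almost like the ℓ_1 (lasso) penalty and shrinks the coefficients of the inactive base functions close to zero, while remaining strictly convex and smooth.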

Recordings:
- QuickTime (mov), video on demand
- MP4 video download, 132.31 MB
- Windows Media video download, 84.34 MB
