

Workshop: Mathematical Foundations of Learning Theory


Regularization of Kernel Methods by Decreasing the Bandwidth of the Gaussian Kernel
Jean-Philippe Vert (École des Mines)

June 2, 2006

We consider learning algorithms that minimize an empirical risk regularized by the norm in the reproducing kernel Hilbert space of the Gaussian kernel. The conditions on the loss function for Bayes consistency of such methods have been studied recently in the setting where the regularization term asymptotically vanishes as the sample size increases. Here we study a different situation, where the regularization term does not vanish but the bandwidth of the Gaussian kernel instead decreases with the sample size. We will give an explicit expression for the asymptotic limit of the function selected by the algorithm, give conditions on the loss function that ensure Bayes consistency, and provide non-asymptotic learning bounds in this case. In particular, we deduce the consistency of the one-class support vector machine algorithm as a density level set estimator.
(Joint work with Régis Vert.)
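The last claim of the abstract — that an estimator built on a Gaussian kernel with sample-size-dependent bandwidth can recover a density level set — can be illustrated with a small numerical sketch. Here we threshold a plain Gaussian kernel density estimate rather than a one-class SVM, and the bandwidth schedule `sigma_n = sigma0 * n**(-beta)` with the parameter values below are illustrative assumptions, not the rates analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kde(X, x, sigma):
    """Gaussian kernel density estimate at query points x, given samples X."""
    d = X.shape[1]
    # squared distances between each query point and each sample
    d2 = ((x[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    norm = (2 * np.pi * sigma ** 2) ** (d / 2)
    return np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1) / norm

def level_set_estimate(X, x, level, sigma0=1.0, beta=0.2):
    """Estimate the density level set {density >= level} by thresholding
    a KDE whose bandwidth shrinks with the sample size n:
    sigma_n = sigma0 * n**(-beta)  (illustrative schedule)."""
    n = X.shape[0]
    sigma = sigma0 * n ** (-beta)
    return gaussian_kde(X, x, sigma) >= level

X = rng.normal(size=(2000, 1))    # 1-D standard normal sample
x = np.array([[0.0], [3.5]])      # near the mode / far in the tail
inside, outside = level_set_estimate(X, x, level=0.1)
print(inside, outside)            # the mode lies in the level set, the tail does not
```

As the sample size grows and the bandwidth shrinks, the KDE concentrates around the true density, so the thresholded set approaches the true level set — the same qualitative behaviour the talk establishes rigorously for the one-class SVM.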

PDF – 313.71 KB

Audio MP3 – ??? (access error)

QuickTime MOV, video on demand

MP4, downloadable video – 118.32 MB

Windows Media Video – 85.81 MB

Jean-Philippe Vert (École des Mines)
Senior Researcher and Adjunct Director since 2008, Centre for Computational Biology, Department of Cancer Computational Genomics, Bioinformatics, Biostatistics and Epidemiology, Institut Curie/Mines ParisTech/INSERM.