

Atelier Apprentissage 2006–2007


Minimax bounds on the distortion of empirical designed vector quantizers
András Antos (Academy of Sciences, Hungary)

20 November 2006

Earlier results show that the minimax expected (test) distortion redundancy of empirical vector quantizers with three or more levels, designed from $n$ independent and identically distributed data points, is at least $\Omega(1/\sqrt{n})$ for the class of distributions on a bounded set. In this paper, a much simpler construction and proof for this are given, with much better constants. There are similar bounds for the training distortion of the empirically optimal vector quantizer with three or more levels. These rates, however, do not hold for a one-level quantizer. Here the two-level quantizer case is clarified, showing that it already shares the behavior of the general case. Since the minimax bounds are proved using a construction involving discrete distributions, one might suspect that for the class of distributions with uniformly bounded continuous densities, the expected distortion redundancy decreases as $o(1/\sqrt{n})$ uniformly. It is shown that this is not so: the lower bound for the expected test distortion remains true for these subclasses.
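To make the quantities in the abstract concrete, here is a minimal NumPy sketch of the setup (not the paper's construction): a two-level quantizer is designed from $n$ i.i.d. training points via Lloyd-style iterations, and its test distortion is compared against that of the codebook that is optimal for the true distribution. The uniform source, the initialization, and all names are illustrative assumptions.

```python
import numpy as np

def lloyd_1d(x, k=2, iters=50):
    """Lloyd's algorithm: an empirically (near-)optimal k-level
    quantizer designed from the 1-D training sample x."""
    c = np.quantile(x, np.linspace(0.25, 0.75, k))  # illustrative init
    for _ in range(iters):
        # assign each point to its nearest codepoint
        idx = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
        # move each codepoint to the mean of its cell (skip empty cells)
        for j in range(k):
            if np.any(idx == j):
                c[j] = x[idx == j].mean()
    return np.sort(c)

def distortion(x, c):
    """Mean squared distance to the nearest codepoint."""
    return np.min((x[:, None] - c[None, :]) ** 2, axis=1).mean()

rng = np.random.default_rng(0)
n = 2000
train = rng.uniform(0.0, 1.0, n)       # distribution on a bounded set
test = rng.uniform(0.0, 1.0, 100_000)  # large held-out sample

c_emp = lloyd_1d(train, k=2)
c_opt = np.array([0.25, 0.75])  # optimal 2-level codebook for U[0, 1]

# test distortion redundancy: excess distortion of the empirically
# designed quantizer over the distribution-optimal one
redundancy = distortion(test, c_emp) - distortion(test, c_opt)
```

The paper's lower bound concerns the worst case over a class of sources: for any design rule there is a distribution on which the expected redundancy is at least of order $1/\sqrt{n}$, even though on a benign source like the uniform above the observed redundancy is typically tiny.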

Recordings of the talk:

- Audio (mp3)
- Video on demand / streaming (QuickTime mov)
- Video to download (mp4) - 158.08 MB
- Video to download (Windows Media Video) - 110.52 MB

András Antos
Computer and Automation Research Institute of the Hungarian Academy of Sciences