Lower bounds and aggregation in density estimation
Abstract
In this paper we prove the optimality of an aggregation procedure. We prove lower bounds for model-selection-type aggregation of $M$ density estimators with respect to the Kullback-Leibler divergence (KL), the Hellinger distance and the $L_1$-distance. The lower bound with respect to the KL divergence can be achieved by the on-line type estimate suggested, among others, by Yang (2000). Combining these results, we show that $\log M/n$ is an optimal rate of aggregation in the sense of Tsybakov (2003), where $n$ is the sample size.
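For context, the notion of an optimal rate of aggregation in the sense of Tsybakov (2003) can be sketched as follows. The class of densities $\mathcal{F}$, the constants $C, c$, and the precise formulation below are standard assumptions for illustration, not quoted from the paper.

```latex
% Sketch: \psi_n is an optimal rate of model-selection aggregation for KL
% if some aggregate \tilde f_n built from f_1,\dots,f_M attains it,
%   \sup_{f \in \mathcal{F}}
%     \Big( \mathbb{E}\,\mathrm{KL}(f,\tilde f_n)
%           - \min_{1\le j\le M} \mathrm{KL}(f,f_j) \Big) \le C\,\psi_n,
% and no estimator can beat it uniformly (the lower bound):
%   \inf_{T_n}\; \sup_{f \in \mathcal{F}}
%     \Big( \mathbb{E}\,\mathrm{KL}(f,T_n)
%           - \min_{1\le j\le M} \mathrm{KL}(f,f_j) \Big) \ge c\,\psi_n.
% The abstract's claim is that \psi_n = (\log M)/n satisfies both bounds.
```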