LASSO and Iterative Feature Selection: Oracle Inequalities and Numerical Performances
Abstract
We propose a general family of algorithms for regression estimation with quadratic loss. Our algorithms are able to select relevant functions from a large dictionary. We prove that several previously studied algorithms (Tibshirani's LASSO and Iterative Feature Selection, among others) belong to this family. We prove oracle-type inequalities in some particular cases, and compare the numerical performance of LASSO and Iterative Feature Selection on a toy example.
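To illustrate the kind of sparse selection the abstract describes, the sketch below fits LASSO by coordinate descent on a synthetic regression problem. This is a minimal, hypothetical illustration (not the paper's actual toy example or its algorithm family): the data-generating setup, the regularization level `lam`, and the helper names are all assumptions made here for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: shrinks z toward zero by t,
    # which is what produces exact zeros (feature selection).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove all features except j.
            resid = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ resid / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

# Hypothetical toy data: 20 candidate features, only 3 truly relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true_beta = np.zeros(20)
true_beta[[0, 3, 7]] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.standard_normal(100)

beta_hat = lasso_cd(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)
print("selected features:", selected)
```

The L1 penalty sets most coefficients exactly to zero, so the fitted model recovers a small subset of the dictionary; the surviving coefficients are shrunk toward zero by roughly `lam`, the usual LASSO bias.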