We consider binary classification in robust one-bit compressed sensing with adversarial errors. The model is assumed to be overparameterized, and the parameter of interest is assumed to be effectively sparse. We study AdaBoost and, through its relation to the max-$\ell_1$-margin classifier, derive prediction error bounds. The theory developed is general and allows for heavy-tailed feature distributions, requiring only a weak moment assumption and an anti-concentration condition. We show improved convergence rates when the features satisfy a small-deviation lower bound. In the case of Gaussian features, we obtain convergence rates comparable to state-of-the-art results in robust one-bit compressed sensing. In particular, these results explain why interpolating adversarial noise can be harmless for classification problems.
This is joint work with F. Kuchelmeister, Matthias Loeffler, and S. van de Geer.