AdaBoost and robust 1-bit compressed sensing

Geoffrey Chinot (ETH Zürich)

Jun 02, 2022, 10:00–10:45

We consider binary classification in robust one-bit compressed sensing with adversarial errors. It is assumed that the model is overparameterized and that the parameter of interest is effectively sparse. We consider AdaBoost and, through its relation to the max-\ell_1-margin classifier, derive prediction error bounds. The developed theory is general and allows for heavy-tailed feature distributions, requiring only a weak moment assumption and an anti-concentration condition. We show improved convergence rates when the features satisfy a small-deviation lower bound. In the case of Gaussian features, we obtain convergence rates comparable to state-of-the-art results in robust one-bit compressed sensing. In particular, these results provide an explanation of why interpolating adversarial noise can be harmless in classification problems.
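The setting above can be illustrated with a minimal sketch: sign measurements of a sparse parameter under Gaussian features, a few adversarial label flips, and AdaBoost run with simple sign-based weak learners. All concrete choices below (dimensions, number of rounds, coordinate-sign weak learners) are illustrative assumptions for exposition, not the construction from the talk.

```python
# Hypothetical sketch: AdaBoost in an overparameterized one-bit
# compressed sensing setup with adversarial label flips.
# Dimensions and weak learners are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 50, 200, 3              # n samples, p features (p >> n), sparsity s

beta = np.zeros(p)
beta[:s] = 1.0 / np.sqrt(s)       # effectively sparse parameter of interest
X = rng.standard_normal((n, p))   # Gaussian features
y = np.sign(X @ beta)             # one-bit (sign) measurements
y[rng.choice(n, size=2, replace=False)] *= -1.0  # adversarial flips

# Weak learners: the sign of each coordinate (and its negation).
H = np.sign(X)                    # n x p matrix of base predictions
D = np.full(n, 1.0 / n)           # sample weights
w = np.zeros(p)                   # aggregated coefficients

for _ in range(200):
    corr = H.T @ (D * y)                      # weighted edge of each learner
    j = int(np.argmax(np.abs(corr)))
    sgn = np.sign(corr[j])
    eps = D[sgn * H[:, j] != y].sum()         # weighted error of chosen learner
    eps = np.clip(eps, 1e-12, 1 - 1e-12)
    alpha = 0.5 * np.log((1 - eps) / eps)     # AdaBoost step size
    w[j] += sgn * alpha
    D *= np.exp(-alpha * y * sgn * H[:, j])   # upweight misclassified samples
    D /= D.sum()

# Normalized l1-margin of the aggregated classifier: as the number of
# rounds grows, w / ||w||_1 tends toward a large-l1-margin direction.
margin = np.min(y * (H @ w)) / np.abs(w).sum()
acc = np.mean(np.sign(H @ w) == y)
```

This is the classical view of AdaBoost as coordinate descent on the exponential loss: each round selects one weak learner and increases its coefficient, and the resulting direction is known to approach a max-ℓ1-margin solution, which is the connection the talk exploits.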


This is joint work with F. Kuchelmeister, M. Loeffler, and S. van de Geer.

Further Information
ESI Boltzmann Lecture Hall
Associated Event:
Computational Uncertainty Quantification: Mathematical Foundations, Methodology & Data (Thematic Programme)
Clemens Heitzinger (TU Vienna)
Fabio Nobile (EPFL Lausanne)
Robert Scheichl (U Heidelberg)
Christoph Schwab (ETH Zürich)
Sara van de Geer (ETH Zürich)
Karen Willcox (U of Texas, Austin)