Under misspecification, the maximum likelihood estimator is often inconsistent, and the posterior distribution in Bayesian statistics suffers from the same problem. To fix this issue, it has been suggested to replace the likelihood by some other measure of distance between probability distributions. This leads to estimation methods known as "minimum distance estimation" (MDE). In this talk, I will focus on MDE when the metric is chosen from the family of Integral Probability Metrics (IPMs), which includes the Kolmogorov distance, the Maximum Mean Discrepancy (MMD) and the Wasserstein distance. I will study the consistency of these estimators. The estimator based on the MMD criterion, in particular, enjoys very strong robustness properties under all kinds of misspecification and contamination of the data. The talk will cover both frequentist and Bayesian methods. While the models covered in the talk will be elementary, there is great potential for more complex models: in particular, the MMD and Wasserstein criteria have already been used successfully to train deep generative networks.
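As a rough illustration of the MMD-based minimum distance estimation mentioned above, the following sketch fits a Gaussian location model to contaminated data by minimizing an empirical squared MMD over a parameter grid. All concrete choices here (Gaussian kernel, bandwidth, grid search, common random numbers across the grid) are illustrative assumptions, not the specific setup of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def mmd2(x, y, bandwidth=1.0):
    """V-statistic estimate of squared MMD with a Gaussian kernel (illustrative choice)."""
    def gram(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2 * gram(x, y).mean()

# Contaminated sample: 90% from N(2, 1), 10% outliers placed at 50
n = 500
data = np.where(rng.random(n) < 0.9, rng.normal(2.0, 1.0, n), 50.0)

# Minimum distance estimation: choose theta minimizing MMD^2 between
# model samples N(theta, 1) and the data, via a simple grid search.
# The same base sample shifted by theta is reused across the grid
# (common random numbers) so the loss curve is smooth in theta.
base = rng.normal(0.0, 1.0, 500)
thetas = np.linspace(-5.0, 10.0, 151)
losses = [mmd2(base + t, data) for t in thetas]
theta_mmd = thetas[int(np.argmin(losses))]

print(theta_mmd)      # near the uncontaminated mean 2
print(data.mean())    # the sample mean is dragged toward the outliers
```

The point of the sketch is the robustness claimed in the abstract: the outliers at 50 are essentially invisible to a bounded kernel, so the MMD criterion recovers a location estimate close to 2, while the plain sample mean (the MLE under the clean model) is badly biased by the contamination.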