A Quadrature Rule combining Control Variates and Adaptive Importance Sampling
Advances in Neural Information Processing Systems 2022. With F. Portier, J. Segers and A. Zhuman. (arXiv)
Statistics and Computing, 31. With F. Portier and J. Segers. (arXiv), (PDF), (code)
International Conference on Machine Learning 2021. With H. Jalalzai. (PDF, arXiv)
Journal of Machine Learning Research 2022. With F. Portier. (PDF)
graduate course, Télécom Paris
Many statistical learning problems (computing an estimator, a classifier, etc.) boil down to minimizing a functional, typically an empirical risk. Optimization methods are therefore central to the “practical” side of statistical learning. In this module, students will study not only the theoretical foundations, which build on the optimization course taken in the first semester, but also techniques designed specifically for massive data.
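To fix ideas, here is a minimal sketch of mini-batch stochastic gradient descent on a least-squares empirical risk; the synthetic data and all settings are assumptions chosen purely for illustration, not course material.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: n observations, d features, linear model with Gaussian noise.
    n, d = 10_000, 5
    X = rng.normal(size=(n, d))
    theta_true = rng.normal(size=d)
    y = X @ theta_true + 0.1 * rng.normal(size=n)

    # Mini-batch stochastic gradient descent on the empirical least-squares risk
    # R_n(theta) = (1/n) * sum_i (y_i - x_i' theta)^2.
    theta = np.zeros(d)
    step, batch = 0.01, 64
    for _ in range(2_000):
        idx = rng.integers(0, n, size=batch)               # sample a mini-batch
        grad = 2.0 * X[idx].T @ (X[idx] @ theta - y[idx]) / batch
        theta -= step * grad

    print(np.linalg.norm(theta - theta_true))              # close to zero

Each update touches only a small batch of observations, which is what makes the method practical when the full dataset is too large to process at every iteration.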
graduate course, Télécom Paris
In many situations, the data available to the statistician are so complex that, at least at first sight, they elude any parametric modelling. The objective of this course is to present more flexible statistical techniques, together with the theoretical issues raised by their implementation: the price of the increased flexibility of non-parametric approaches is the risk of “over-fitting” the model to the data. Through examples, we will discuss the “minimax” perspective on non-parametric estimation, the “bias/variance” trade-off as a function of the “complexity” of the model, and the statistical learning paradigm of “empirical risk minimization”.
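For reference, the bias/variance trade-off mentioned above is captured by the standard decomposition of the mean squared error of an estimator of a target f(x) (a generic textbook identity, stated here only as an illustration):

    \mathbb{E}\bigl[(\hat f(x) - f(x))^2\bigr]
      = \underbrace{\bigl(\mathbb{E}[\hat f(x)] - f(x)\bigr)^2}_{\text{squared bias}}
      + \underbrace{\operatorname{Var}\bigl(\hat f(x)\bigr)}_{\text{variance}}.

Increasing the complexity of the model typically shrinks the bias term while inflating the variance term, which is exactly the over-fitting risk referred to above.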
graduate course, Télécom Paris
In this course, we will first discuss the simple linear model (least squares) before presenting the general framework, which includes logistic regression. We will then consider estimation and testing problems in these models. Finally, we will present the problem of variable selection in this context, relying mainly on L1 (Lasso) regularization/penalization and on greedy selection methods.
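As an illustration of L1-penalized variable selection, here is a minimal sketch using scikit-learn's Lasso on synthetic data; the problem sizes and the penalty level are assumptions chosen for the example, not course material.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Lasso

    # Synthetic regression problem: only 5 of the 50 features are informative.
    X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                           noise=1.0, random_state=0)

    # L1-penalized least squares: the penalty drives most coefficients exactly
    # to zero, so variable selection comes as a by-product of the fit.
    model = Lasso(alpha=1.0).fit(X, y)
    print("selected variables:", np.flatnonzero(model.coef_))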
graduate course, Télécom Paris
The modes of convergence for sequences of random variables will be reviewed and developed further, with particular attention to convergence in distribution (convergence in law) and its applications in asymptotic statistics. The theory of discrete-time martingales, an essential tool for stochastic calculus, will then be introduced, with applications to Markov chains.
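A small simulation can illustrate convergence in distribution; the example below uses the central limit theorem for exponential variables and is purely an illustrative sketch, not course material.

    import numpy as np

    rng = np.random.default_rng(0)

    # Central limit theorem: for i.i.d. exponential(1) variables (mean 1,
    # variance 1), sqrt(n) * (sample mean - 1) converges in distribution
    # to a standard normal as n grows.
    for n in (5, 50, 500):
        samples = rng.exponential(scale=1.0, size=(20_000, n))
        z = np.sqrt(n) * (samples.mean(axis=1) - 1.0)
        # Empirical P(Z <= 1) approaches the standard normal value Phi(1) ~ 0.841.
        print(n, (z <= 1.0).mean())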
graduate course, Télécom Paris
The objective of this course is to give students the tools needed to practice statistics and data analysis by explaining the fundamental concepts: models, estimators, point estimation, interval estimation, construction principles (method of moments, maximum likelihood), quality criteria (bias/variance, Cramér-Rao bounds), decision procedures, and hypothesis testing. The lectures will be illustrated with practical examples from statistical data analysis.
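As a concrete illustration of point and interval estimation, here is a minimal sketch for the mean of a Gaussian sample; the true parameters and sample size are assumptions chosen for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Model: X_1, ..., X_n i.i.d. N(mu, sigma^2), both parameters unknown.
    x = rng.normal(loc=2.0, scale=3.0, size=200)
    n = x.size

    mu_hat = x.mean()        # point estimate of mu (moment / maximum-likelihood)
    s = x.std(ddof=1)        # sample standard deviation

    # Approximate 95% confidence interval based on the normal approximation
    # (an exact interval would use a Student-t quantile instead of 1.96).
    half_width = 1.96 * s / np.sqrt(n)
    print(f"mu_hat = {mu_hat:.2f}, 95% CI = "
          f"[{mu_hat - half_width:.2f}, {mu_hat + half_width:.2f}]")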