
A list of all the posts and pages found on the site. For you robots out there, there's an XML version available for digesting as well.


About me

This is a page not in the main menu.

**Published:**

This post will show up by default. To disable scheduling of future posts, edit `config.yml` and set `future: false`.
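For reference, the relevant fragment of the site configuration might look like the following (a sketch; the exact surrounding keys depend on the theme):

```yaml
# config.yml (Jekyll site configuration)
future: false   # posts dated in the future are not built
```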

**Published:**

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.


**Published:**

With F. Portier. (PDF)

To appear in *Statistics and Computing*. With F. Portier and J. Segers. (arXiv), (PDF), (code)

To appear at ICML 2021. With H. Jalalzai. (PDF)

With F. Portier. (PDF)

**Published:**

This is joint work with François Portier and Johan Segers. (arXiv)


graduate course, *Télécom Paris*

Many statistical learning problems (computing an estimator, a classifier, etc.) boil down to the minimization of a functional, typically an empirical risk. Optimization methods are therefore central to the "practical" side of statistical learning. In this module, students will discover the theoretical foundations, which continue the optimization course taken in the first semester, as well as techniques designed specifically for massive data.
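As a flavor of the kind of method covered, here is a minimal sketch (with illustrative data and names, not material from the course) of stochastic gradient descent on the empirical risk of least-squares linear regression, the prototypical optimization method for massive data:

```python
import numpy as np

# Illustrative sketch: stochastic gradient descent (SGD) minimizing the
# empirical risk (1/n) * sum_i (x_i . theta - y_i)^2 of linear regression.
rng = np.random.default_rng(0)
n, d = 500, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.normal(size=n)

theta = np.zeros(d)
step = 0.01
for epoch in range(50):
    for i in rng.permutation(n):                # one pass over the data
        grad_i = (X[i] @ theta - y[i]) * X[i]   # gradient of one sample's loss
        theta -= step * grad_i
# theta now approximates theta_true
```

Each update touches a single data point, so the cost per step is independent of the sample size, which is what makes the method attractive for massive data.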

graduate course, *Télécom Paris*

In many situations, the data available to the statistician are so complex that, at least on first analysis, they escape any parametric modelling. The objective of this course is to present less rigid statistical techniques, as well as the theoretical issues inherent in their implementation: the counterpart of the increased flexibility of non-parametric approaches is the risk of "over-fitting" the model to the data. Through examples, we will discuss the "minimax" perspective on non-parametric estimation, the "bias/variance" trade-off as a function of model "complexity", and the "empirical risk minimization" paradigm of statistical learning.

graduate course, *Télécom Paris*

In this course, we will first discuss the simple linear (least squares) model before presenting the general framework, which includes logistic regression. We will then consider the estimation and testing problems in these models. Finally, we will present the problem of variable selection in such a context, relying mainly on L1 (Lasso) regularization/penalization and on greedy selection methods.
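The two fitting procedures mentioned above can be sketched on simulated data (all names and parameter choices below are illustrative, not course material): least squares via the normal equations, and an L1-penalized (Lasso-type) fit computed by proximal gradient descent, whose soft-thresholding step produces exact zeros and hence performs variable selection.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 10
X = rng.normal(size=(n, d))
beta_true = np.zeros(d)
beta_true[:3] = [2.0, -1.5, 1.0]          # sparse ground truth
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Ordinary least squares.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Lasso-type fit via proximal gradient descent (ISTA).
lam = 8.0                                  # L1 penalty level
step = 1.0 / np.linalg.norm(X, 2) ** 2     # 1 / Lipschitz constant of the gradient
beta = np.zeros(d)
for _ in range(500):
    z = beta - step * (X.T @ (X @ beta - y))                      # gradient step
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft-threshold
```

On this example the penalized estimate recovers the support of `beta_true`: the seven truly null coefficients are set exactly to zero, at the price of a small shrinkage bias on the active ones.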

graduate course, *Télécom Paris*

The modes of convergence for random sequences will be reviewed and developed further. Convergence in law will be studied in particular detail, with applications in asymptotic statistics. The theory of discrete-time martingales, an essential tool for stochastic calculus, will be introduced, with applications to Markov chains.
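On a toy two-state Markov chain (invented for illustration), two facts in the spirit of the course can be checked numerically: the stationary distribution pi solves pi P = pi, and the k-step transition matrix P^k converges to a rank-one matrix whose rows are all pi, i.e. X_k converges in law to pi from any initial state.

```python
import numpy as np

# A toy two-state transition matrix.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left Perron eigenvector of P, normalized to sum 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# k-step transitions: each row of P^50 is (numerically) the law of X_50.
Pk = np.linalg.matrix_power(P, 50)
```

The speed of this convergence is governed by the second-largest eigenvalue of P (here 0.5, so P^50 agrees with the stationary limit to machine precision).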

graduate course, *Télécom Paris*

The objective of this course is to give students the tools needed to practice statistics and data analysis by explaining the fundamental concepts: models, estimators, point estimation, interval estimation, construction principles (moments, likelihood), quality criteria (bias/variance, Cramér-Rao bounds), decision procedures, and hypothesis testing. The lectures will be illustrated with practical examples from statistical data analysis.
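Several of these concepts fit in a few lines on simulated data (a sketch with invented data, not course material): the maximum-likelihood point estimate of a Bernoulli parameter, a Wald-type asymptotic confidence interval, and a test statistic for a simple null hypothesis.

```python
import numpy as np

rng = np.random.default_rng(3)
p_true, n = 0.3, 1000
x = rng.binomial(1, p_true, size=n)   # n Bernoulli(p_true) observations

p_hat = x.mean()                               # MLE (also the moment estimator)
se = np.sqrt(p_hat * (1.0 - p_hat) / n)        # estimated standard error
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)    # asymptotic 95% interval

# Wald statistic for H0: p = 0.5; |z| > 1.96 rejects at the 5% level.
z = (p_hat - 0.5) / np.sqrt(0.25 / n)
```

Here the MLE and the moment estimator coincide; for other models (construction principles above) they differ, and the bias/variance and Cramér-Rao criteria are what allow them to be compared.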