Abstract:
|
Many recent breakthroughs have been made in the field of selective inference. I will briefly review some of the advances in this field, and will emphasize that much of this work depends critically on the assumption of an i.i.d. homoskedastic normal error model (with errors also assumed independent of the predictor variables). I will discuss asymptotic theory that certifies, in low-dimensional settings (and some restricted high-dimensional settings), that the tools of selective inference remain valid outside of the normal error model.
I will then shift focus and talk about "truly" distribution-free tools, stemming from the classic notion of sample splitting, and from a much less well-known but also powerful framework for inference based on conformal prediction theory.
This talk represents an amalgamation of work with Larry Wasserman, Robert Tibshirani, Jing Lei, Max G'Sell, and Alessandro Rinaldo.
|