Abstract:
|
Regularized estimation techniques, and in particular lasso, are widely used in high-dimensional models to obtain more reliable parameter estimates and to improve prediction accuracy. However, valid inference for these procedures is challenging due to their inherent bias. In this talk, we revisit the variable selection effect of lasso and present a general framework for asymptotically valid inference using lasso. We show that recent proposals for inference, based either on de-biased (de-sparsified) lasso estimates or on exact post-selection inference conditional on the selected model, can be seen in light of the new findings on the variable selection properties of lasso. We present empirical evidence in support of the proposed inference framework and demonstrate its applicability in high-dimensional settings.
|
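To make the de-biased lasso idea mentioned above concrete, here is a minimal illustrative sketch (not the speaker's proposed framework): fit lasso, correct the bias of one coordinate via a nodewise lasso regression, and form an approximate confidence interval. The dimensions, penalty level `lam`, and coordinate `j` are arbitrary choices for the demo.

```python
# Illustrative de-biased lasso sketch (assumed setup, not the talk's method).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 500                      # high-dimensional: p > n
beta = np.zeros(p); beta[:5] = 1.0   # sparse true signal (assumed for the demo)
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

# Step 1: lasso fit (penalty level chosen ad hoc for illustration)
lam = 0.1
lasso = Lasso(alpha=lam).fit(X, y)
beta_hat = lasso.coef_
resid = y - X @ beta_hat

# Step 2: de-bias coordinate j using a nodewise lasso of x_j on the other columns
j = 0
others = np.delete(np.arange(p), j)
node = Lasso(alpha=lam).fit(X[:, others], X[:, j])
z_j = X[:, j] - X[:, others] @ node.coef_        # approximate residual direction

# Step 3: de-biased estimate and a plug-in 95% confidence interval
beta_deb = beta_hat[j] + z_j @ resid / (z_j @ X[:, j])
sigma_hat = np.sqrt(resid @ resid / n)           # crude noise-level estimate
se = sigma_hat * np.linalg.norm(z_j) / abs(z_j @ X[:, j])
print(f"de-biased beta_{j}: {beta_deb:.3f} +/- {1.96 * se:.3f}")
```

The plain lasso coefficient is shrunk toward zero, while the corrected estimate admits an approximately normal distribution, which is what makes confidence intervals and tests possible in this setting.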