
Abstract Details

Activity Number: 579
Type: Invited
Date/Time: Wednesday, August 3, 2016, 2:00 PM to 3:50 PM
Sponsor: ENAR
Abstract #318129
Title: CoCoLasso for High-Dimensional Error-in-Variables Regression
Author(s): Hui Zou* and Abhirup Datta
Companies: University of Minnesota and University of Minnesota
Keywords: Lasso; Measurement error; Convexity

Corrupted data arise in many applications where missing values and measurement errors cannot be ignored. Loh and Wainwright (2012, AoS) proposed an interesting non-convex modification of the Lasso for high-dimensional regression with noisy and missing data. The non-convex formulation raises the issue of multiple local minimizers. Through careful analysis, they showed that a projected gradient descent algorithm converges in polynomial time to a small neighborhood of the set of all global minimizers. In this article, we argue that the virtues of convexity contribute fundamentally to the success and popularity of the Lasso. In light of this, we propose a new method, CoCoLasso, that is convex and handles a general class of corrupted datasets, including additive measurement error and randomly missing data. CoCoLasso automatically enjoys the benefits of convexity for high-dimensional regression. We derive statistical error bounds for CoCoLasso as well as its sign-consistent selection property. Simulation studies demonstrate the superior performance of our method over the non-convex approach of Loh and Wainwright (2012).
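The abstract does not spell out the estimator, but its central idea — replace the noise-corrupted Gram matrix with a nearest positive semidefinite surrogate so that an ordinary convex Lasso program can be solved — can be sketched as follows for the additive-measurement-error case. This is a minimal illustration, not the paper's implementation: the function names, the known error variance `tau2`, and the plain ISTA solver are all assumptions made for the sketch.

```python
import numpy as np

def nearest_psd(S, eps=1e-8):
    # Project a symmetric matrix onto the PSD cone by truncating
    # negative eigenvalues (restores convexity of the quadratic term).
    w, V = np.linalg.eigh((S + S.T) / 2)
    return V @ np.diag(np.maximum(w, eps)) @ V.T

def corrected_lasso(Z, y, tau2, lam, n_iter=500):
    # Z = X + additive noise with (assumed known) variance tau2 per entry.
    n, p = Z.shape
    sigma_hat = Z.T @ Z / n - tau2 * np.eye(p)  # unbiased surrogate for X'X/n
    sigma_psd = nearest_psd(sigma_hat)          # convexified Gram matrix
    rho = Z.T @ y / n
    # Solve min_b 0.5 b' Sigma b - rho' b + lam * ||b||_1 by
    # proximal gradient descent (ISTA) with step 1/L.
    L = np.linalg.eigvalsh(sigma_psd).max() + 1e-12
    step = 1.0 / L
    b = np.zeros(p)
    for _ in range(n_iter):
        u = b - step * (sigma_psd @ b - rho)            # gradient step
        b = np.sign(u) * np.maximum(np.abs(u) - step * lam, 0.0)  # soft-threshold
    return b
```

Because the surrogate Gram matrix is positive semidefinite, the objective is convex, so any first-order method converges to a global minimizer — the property the abstract contrasts with the non-convex formulation of Loh and Wainwright (2012).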

Authors who are presenting talks have a * after their name.

