
Abstract Details

Activity Number: 354 - Topics in Machine Learning
Type: Contributed
Date/Time: Tuesday, July 31, 2018 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #330176 Presentation
Title: Optimization Over Nonconvex Constraints
Author(s): Wooseok Ha* and Rina Foygel Barber
Companies: University of Chicago
Keywords: Alternating minimization; Local concavity coefficients; Low-rank estimation; Nonconvex constraints; Prox-regular set; Gradient descent

Many problems in modern statistics can be formulated as optimization problems with structured constraints, where the constraints often exhibit nonconvexity, such as sparsity or low rank. However, working with nonconvex constraints presents challenges from both a theoretical and a practical point of view. In this talk, we discuss the convergence behavior of two widely used algorithms, projected gradient descent and the alternating minimization method, in the presence of nonconvex constraints. A major tool allowing us to handle the nonconvex constraints is the local concavity coefficient, which measures the concavity of a general nonconvex set. In the setting of alternating minimization, our result further reveals an important distinction between alternating and non-alternating methods. We demonstrate our framework on a range of specific examples with rank-constrained variables, including factor models and multitask regression.
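To illustrate the kind of algorithm the abstract studies, here is a minimal sketch of projected gradient descent over a nonconvex rank constraint: a gradient step on a smooth loss, followed by projection onto the set of rank-at-most-r matrices via truncated SVD (Eckart-Young). The toy least-squares loss and all function names are illustrative assumptions, not the authors' implementation, and the sketch ignores the step-size and local-concavity conditions that the actual convergence theory requires.

```python
import numpy as np

def project_rank(X, r):
    """Project X onto the (nonconvex) set of rank-<=r matrices
    by truncating its singular values (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0
    return (U * s) @ Vt

def projected_gradient_descent(grad, X0, r, step=0.1, iters=200):
    """Minimize a smooth loss over rank-<=r matrices:
    alternate a gradient step with projection onto the constraint set."""
    X = X0
    for _ in range(iters):
        X = project_rank(X - step * grad(X), r)
    return X

# Toy example: recover a rank-2 matrix M from the least-squares loss
# f(X) = 0.5 * ||X - M||_F^2, whose gradient is X - M.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
X_hat = projected_gradient_descent(lambda X: X - M, np.zeros_like(M), r=2)
print(np.linalg.norm(X_hat - M))  # residual shrinks geometrically toward 0
```

In this well-behaved toy problem the iterates converge to the true low-rank matrix; the talk's contribution is characterizing when such convergence holds for general nonconvex constraint sets, via the local concavity coefficient.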

Authors who are presenting talks have a * after their name.

Back to the full JSM 2018 program