
Abstract Details

Activity Number: 494
Type: Contributed
Date/Time: Wednesday, August 3, 2016, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Computing
Abstract #320665
Title: Accelerated Schemes for a Class of Variational Inequalities
Author(s): Yuyuan Ouyang* and Yunmei Chen and Guanghui Lan
Companies: University of Florida and University of Florida
Keywords: Convex optimization ; Stochastic programming ; Variational inequalities ; Mirror-prox method ; Extragradient method
Abstract:

We propose a class of novel methods, namely the accelerated mirror-prox (AMP) methods, for solving a class of deterministic and stochastic monotone variational inequalities (VI). The main idea of the proposed algorithms is to incorporate a multi-step acceleration scheme into the mirror-prox method. For both deterministic and stochastic VIs, the developed AMP methods compute weak solutions with the optimal iteration complexity. In particular, if the monotone operator in the VI consists of the gradient of a smooth function, the iteration complexities of the AMP methods can be accelerated in terms of their dependence on the Lipschitz constant of the smooth function. For VIs with bounded feasible sets, the bounds of the iteration complexities of the AMP methods depend on the diameter of the feasible set. For unbounded VIs, we adopt the modified gap function introduced by Monteiro and Svaiter for solving monotone inclusion problems, and show that the iteration complexities of the AMP methods depend on the distance from the initial point to the set of strong solutions. We also demonstrate the advantages of the AMP methods over some existing algorithms in our preliminary numerical experiments.
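
To make the mirror-prox machinery that the abstract builds on concrete, below is a minimal Python sketch of the classical Euclidean extragradient (mirror-prox) iteration applied to a toy bilinear saddle-point VI over a ball. This is a generic background illustration only, not the authors' accelerated AMP scheme; the operator F, the ball feasible set, and the step size gamma = 1/L are illustrative assumptions chosen for the sketch.

import numpy as np

# Toy monotone VI: find z* in Z with <F(z*), z - z*> >= 0 for all z in Z,
# where F(x, y) = (A y, -A^T x) arises from the bilinear problem
# min_x max_y x^T A y, and Z is a Euclidean ball (a bounded feasible set).
# Generic extragradient/mirror-prox sketch, not the AMP method itself.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
L = np.linalg.norm(A, 2)      # Lipschitz constant of F (spectral norm of A)
gamma = 1.0 / L               # standard extragradient step size

def F(z):
    x, y = np.split(z, 2)
    return np.concatenate([A @ y, -A.T @ x])

def project(z, radius=1.0):
    # Euclidean projection onto the ball standing in for the feasible set.
    nrm = np.linalg.norm(z)
    return z if nrm <= radius else z * (radius / nrm)

z = project(rng.standard_normal(2 * n))   # arbitrary feasible starting point
avg = np.zeros_like(z)
for k in range(1, 501):
    w = project(z - gamma * F(z))   # extrapolation (prox) step
    z = project(z - gamma * F(w))   # correction step using F at the extrapolated point
    avg += (w - avg) / k            # ergodic average, which carries the O(1/k) gap bound

print("distance of averaged iterate to the solution z* = 0:", np.linalg.norm(avg))

Per the abstract, the AMP methods add a multi-step acceleration scheme on top of this prox/correction structure, improving the complexity's dependence on the Lipschitz constant when the operator contains the gradient of a smooth function.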


Authors who are presenting talks have a * after their name.
