
Abstract Details

Activity Number: 307 - Novel Approaches for Analyzing Dynamic Networks
Type: Contributed
Date/Time: Tuesday, July 30, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #301716
Title: Dynamic Stochastic Mirror Descent with Statistical Applications
Author(s): Shih-Kang Chao* and Guang Cheng
Companies: University of Missouri-Columbia and Purdue Statistics
Keywords: stochastic gradient descent; stochastic approximation; big data; online learning; mirror descent; stability analysis
Abstract:

Stochastic gradient descent (SGD) is a popular algorithm that can handle extremely large data sets due to its low computational cost at each iteration and low memory requirement. Asymptotic distributional results for SGD are well known (Kushner and Yin, 2003). However, a major drawback of SGD is that it does not adapt well to the underlying structure of the solution, such as sparsity. Thus, variations of SGD have been developed, many of which are based on the concept of dynamic stochastic mirror descent (d-SMD). In this paper, a rigorous distributional analysis of d-SMD with constant step size is developed. To this end, a novel continuous-mapping-theorem-type result for a sequence of conjugates of the local Bregman divergence is established to characterize the asymptotic distribution. As a main application, our results shed light on the key statistical properties of the l1-norm-based d-SMD algorithm, including its bias, variable selection consistency, and asymptotic distribution.
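To make the idea of an l1-norm-based stochastic mirror descent step concrete, the following is a minimal sketch of a composite (proximal-style) mirror descent update with a constant step size for sparse least squares. The function names (soft_threshold, l1_smd) and the specific Euclidean-plus-l1 composite form are illustrative assumptions; the dynamic mirror map and Bregman divergence analyzed in the paper may differ.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding: proximal operator of tau * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def l1_smd(X, y, eta=0.01, lam=0.1, n_epochs=5, seed=0):
    """Stochastic mirror-descent-style iteration with an l1 proximal step
    and a constant step size eta, applied to least-squares loss.

    Illustrative sketch only: the d-SMD update studied in the paper may use
    a different (time-varying) mirror map and Bregman divergence.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ theta - y[i]) * X[i]          # stochastic gradient
            theta = soft_threshold(theta - eta * grad,    # gradient step
                                   eta * lam)             # l1 shrinkage
    return theta

if __name__ == "__main__":
    # Synthetic sparse regression example
    rng = np.random.default_rng(1)
    n, d = 500, 20
    beta = np.zeros(d)
    beta[:3] = [2.0, -1.5, 1.0]                           # sparse truth
    X = rng.standard_normal((n, d))
    y = X @ beta + 0.1 * rng.standard_normal(n)
    print(np.round(l1_smd(X, y), 2))
```

With a constant step size, the iterates fluctuate around the sparse target rather than converging exactly, which is the regime in which the abstract's distributional analysis applies.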


Authors who are presenting talks have a * after their name.
