Abstract Details

Activity Number: 174 - Statistical Optimality in High-Dimensional Models and Tradeoffs with Computational Complexity, Privacy and Communication Constraints
Type: Contributed
Date/Time: Tuesday, August 4, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: IMS
Abstract #313472
Title: Distributed Gaussian Mean Estimation Under Communication Constraints: Optimal Rates and Communication-Efficient Algorithms
Author(s): Hongji Wei* and Tony Cai
Companies: Wharton Department of Statistics, University of Pennsylvania
Keywords: Distributed learning; minimax rate; communication constraints; Gaussian mean estimation
Abstract:

We study distributed estimation of a Gaussian mean under communication constraints in a decision-theoretic framework. Minimax rates of convergence, which characterize the tradeoff between communication cost and statistical accuracy, are established in both the univariate and multivariate settings. Communication-efficient and statistically optimal procedures are developed. In the univariate case, the optimal rate depends only on the total communication budget, as long as each local machine has a budget of at least one bit. In the multivariate case, however, the minimax rate depends on the specific allocation of the communication budget among the local machines. Although optimal estimation of a Gaussian mean is relatively simple in the conventional setting, it is quite involved under communication constraints, both in the design of optimal procedures and in the lower bound argument. The techniques developed in this paper may be of independent interest. An essential step is the decomposition of the minimax estimation problem into two stages, localization and refinement. This decomposition provides a framework for both the lower bound analysis and the design of optimal procedures.
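
To make the communication budget concrete, below is a minimal simulation sketch of the univariate setting. It is not the authors' procedure: each local machine holds a few Gaussian observations, compresses its local sample mean to a fixed number of bits by uniform quantization over a known range, and the central server averages the dequantized messages. The machine count, local sample size, bit budget, and quantization range are illustrative assumptions only; the abstract's point is that the best achievable error depends on how such budgets are set and, in the multivariate case, allocated across machines.

```python
# Illustrative sketch only (not the authors' method): naive quantize-and-average
# estimation of a univariate Gaussian mean under a per-machine bit budget.
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits, lo=-1.0, hi=1.0):
    """Uniformly quantize x in [lo, hi] to 2**bits levels and return the
    dequantized value (what the server reconstructs from the message)."""
    levels = 2 ** bits
    x = np.clip(x, lo, hi)
    idx = np.round((x - lo) / (hi - lo) * (levels - 1))
    return lo + idx * (hi - lo) / (levels - 1)

def distributed_mean_estimate(theta=0.3, machines=50, n_local=20, bits=4):
    """Each machine sends `bits` bits describing its local sample mean;
    the central server averages the dequantized messages."""
    messages = []
    for _ in range(machines):
        local = rng.normal(theta, 1.0, size=n_local)  # local i.i.d. Gaussian data
        messages.append(quantize(local.mean(), bits))
    return np.mean(messages)

if __name__ == "__main__":
    est = distributed_mean_estimate()
    print(f"estimate from 4-bit messages: {est:.4f}")
```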


Authors who are presenting talks have a * after their name.
