JSM 2013 Abstract Details
Activity Number: 136
Type: Contributed
Date/Time: Monday, August 5, 2013, 8:30 AM to 10:20 AM
Sponsor: IMS
Abstract: #307964
Title: Law of Log Determinant of Sample Covariance Matrix and Optimal Estimation of Differential Entropy for High-Dimensional Gaussian Distributions
Author(s): Tengyuan Liang*+ and Tony Cai and Harrison Zhou
Companies: University of Pennsylvania and University of Pennsylvania and Yale University
Keywords: asymptotic optimality; covariance matrix; determinant; differential entropy; minimax lower bound; sharp minimaxity

Abstract:
The differential entropy and the log determinant of the covariance matrix of a multivariate Gaussian distribution have many applications in coding, communications, signal processing, and statistical inference. In this paper we consider optimal estimation of the differential entropy and the log determinant of the covariance matrix in the high-dimensional setting. We first establish a central limit theorem for the log determinant of the sample covariance matrix when the dimension $p$ can grow with the sample size $n$. We then construct an estimator of the differential entropy and the log determinant and obtain its optimal rate of convergence. It is shown that in the case $p/n \rightarrow 0$ the estimator is asymptotically sharp minimax. The ultra-high-dimensional setting where $p > n$ is also discussed. This is joint work with Tony Cai and Harrison Zhou.
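The abstract does not spell out the estimator, but the classical starting point is the Wishart identity $E[\log\det S] = \log\det\Sigma + \sum_{k=1}^{p}[\psi((n-k+1)/2) - \log(n/2)]$, where $S$ is the sample covariance with divisor $n$ and $\psi$ is the digamma function. The sketch below, a minimal illustration and not the authors' construction, uses this identity to debias $\log\det S$ and plug the result into the Gaussian entropy formula $h = \tfrac{1}{2}\log\det\Sigma + \tfrac{p}{2}\log(2\pi e)$; the function names are illustrative, and the mean is assumed known and zero for simplicity.

```python
import numpy as np
from scipy.special import digamma

def corrected_logdet(X):
    """Bias-corrected estimate of log det(Sigma) from an n-by-p sample of
    N(0, Sigma) rows (mean assumed known and zero for simplicity)."""
    n, p = X.shape
    S = X.T @ X / n                       # sample covariance, divisor n
    _, logdet_S = np.linalg.slogdet(S)    # stable log|det|
    # Wishart identity:
    #   E[log det S] = log det Sigma + sum_{k=1}^p [psi((n-k+1)/2) - log(n/2)]
    bias = sum(digamma((n - k + 1) / 2) - np.log(n / 2) for k in range(1, p + 1))
    return logdet_S - bias

def gaussian_entropy_estimate(X):
    """Plug-in estimate of the differential entropy of N(0, Sigma)."""
    _, p = X.shape
    return 0.5 * corrected_logdet(X) + 0.5 * p * np.log(2 * np.pi * np.e)

rng = np.random.default_rng(0)
n, p = 2000, 5                            # p/n small: the sharp-minimax regime
X = rng.standard_normal((n, p))           # Sigma = I, so log det Sigma = 0
logdet_hat = corrected_logdet(X)
entropy_hat = gaussian_entropy_estimate(X)
```

For $\Sigma = I_p$ the true log determinant is $0$ and the true entropy is $\tfrac{p}{2}\log(2\pi e)$, so with $p/n = 0.0025$ both estimates should land close to these targets.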
Authors who are presenting talks have a * after their name.