Activity Number: 31
Type: Contributed
Date/Time: Sunday, August 2, 2009, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract - #304826
Title: Norm Dependence in Regularized Estimation of Large Covariance Matrices
Author(s): Iain Johnstone*+
Companies: Stanford University
Address: Department of Statistics, Stanford, CA 94305
Keywords: large covariance matrix; optimal rate of convergence; minimax estimation

Abstract:
Consider estimation of a covariance matrix when the numbers of variables and observations are both large. Assume that the covariance matrix belongs to a class in which the covariances decay at least at a given polynomial rate away from the diagonal. Over such classes, the optimal (minimax) rate of estimation was recently determined by Cai, Zhang, and Zhou for two error measures: the operator norm and the Frobenius norm. An interesting finding was that the rates differ for the two norms. We consider a one-parameter family of norms on eigenvalues that contains the operator and Frobenius norms as particular cases. We determine the optimal rate and tapering point, and show that both vary smoothly with the norm parameter r along a path that connects the operator and Frobenius cases. The results provide some insight into how the optimal rate of estimation depends on the particular norm.
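The abstract does not define the norm family. A natural candidate containing both the operator norm (as r grows large) and the Frobenius norm (at r = 2) is the Schatten r-norm, the l_r norm of the eigenvalues (singular values, for a symmetric positive-definite covariance matrix); whether this is exactly the family studied is an assumption here. A minimal numerical sketch of that interpolation:

```python
import numpy as np

def schatten_norm(A, r):
    """Schatten r-norm: the l_r norm of the singular values of A.
    (Assumed illustration of the 'norms on eigenvalues' family; the
    abstract itself does not name the family.)"""
    sigma = np.linalg.svd(A, compute_uv=False)
    return np.sum(sigma ** r) ** (1.0 / r)

# A small symmetric example matrix, purely illustrative.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

frobenius = np.linalg.norm(A, "fro")  # Frobenius norm
operator = np.linalg.norm(A, 2)       # operator (spectral) norm

# r = 2 recovers the Frobenius norm exactly.
print(abs(schatten_norm(A, 2) - frobenius))
# Large r approaches the operator norm (r kept moderate to avoid overflow).
print(abs(schatten_norm(A, 50) - operator))
```

Varying r thus traces a path between the two error measures compared by Cai, Zhang, and Zhou, which is the sense in which the rate can be studied as a function of the norm.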
- The address information is for the authors that have a + after their name.
- Authors who are presenting talks have a * after their name.