Abstract Details

Activity Number: 49 - Statistical Inference for Large-Scale Financial Data
Type: Invited
Date/Time: Sunday, July 30, 2017 : 4:00 PM to 5:50 PM
Sponsor: IMS
Abstract #322139
Title: Testing and Scoring High-dimensional Covariance Matrices When Heteroscedasticity is Present
Author(s): Xinghua Zheng* and Xinxin Yang and Jiaqi Chen and Hua Li
Companies: HKUST and HKUST and Harbin Institute of Technology and Changchun University
Keywords: high dimension ; covariance matrix ; heteroscedasticity ; self-normalization ; central limit theorem
Abstract:

We study tests for high-dimensional covariance matrices when the data exhibit heteroscedasticity. Each observation is modeled as the product of a heteroscedastic scalar and an i.i.d. multi-dimensional random vector, and we aim to test the covariance matrix up to the heteroscedastic scalar. To remove the heteroscedasticity, we self-normalize the observations and establish a CLT for the linear spectral statistics (LSS) of the sample covariance matrix built from the self-normalized observations. The CLT differs from the existing CLTs for the LSS of the usual sample covariance matrix (Bai and Silverstein (2004, AoP); Najim and Yao (2013, AoAP)). Tests based on the new CLT neither assume a specific parametric distribution for the data nor involve extra terms that depend on the fourth moment. Empirically, we use our tests to evaluate different predictors of the covariance matrix of S&P 500 Financials sector stock returns. The results show that (1) self-normalizing the observations can improve prediction, and (2) an approximate factor model is more suitable for stock returns than a sparsity-based structure.
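To make the self-normalization step concrete, here is a minimal sketch (not the authors' code; the simulated model, parameters, and the test function f(x) = log x are illustrative assumptions). It simulates heteroscedastic observations, rescales each to unit Euclidean norm to remove the unknown scalar, and computes one linear spectral statistic of the resulting sample covariance matrix.

import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 400                        # dimension and sample size (p/n bounded away from 0 and infinity)

# Toy model: each observation y_i = w_i * x_i, a heteroscedastic scalar times an
# i.i.d. random vector (identity population covariance in this illustration).
w = np.exp(rng.normal(size=n))         # hypothetical heteroscedastic scalars
X = rng.standard_normal((n, p))
Y = w[:, None] * X

# Self-normalization: rescale each observation to unit Euclidean norm,
# which removes the unknown scalar w_i.
Y_sn = Y / np.linalg.norm(Y, axis=1, keepdims=True)

# Sample covariance of the self-normalized observations, scaled by p so the
# eigenvalues have a non-degenerate limit, and an example LSS with f(x) = log x.
S = p * (Y_sn.T @ Y_sn) / n
eigvals = np.linalg.eigvalsh(S)
lss = np.sum(np.log(eigvals))
print(lss)

The snippet only shows how such a statistic is formed; the centering and scaling under which the LSS is asymptotically normal are given by the CLT in the paper.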

Based on joint work with Xinxin Yang, Jiaqi Chen and Hua Li.


Authors who are presenting talks have a * after their name.
