Abstract Details

Activity Number: 546 - Recent Advances in Time Series and Point Process
Type: Invited
Date/Time: Wednesday, July 31, 2019 : 2:00 PM to 3:50 PM
Sponsor: Business and Economic Statistics Section
Abstract #300551
Title: A Class of Generalized Self-Normalizers for Inference of Time Series and Its Optimal Weighting
Author(s): Ting Zhang*
Companies: Boston University
Keywords: Gateaux derivatives; generalized self-normalizers; probabilistically linear and nonlinear parameters; recursive estimators; von Mises expansion

Self-normalization has been celebrated for its ability to avoid direct estimation of the nuisance asymptotic variance and for its versatility in handling the mean and other quantities. However, the self-normalizer in its conventional form uses recursive estimators of only one direction, and thus may exhibit a certain degree of asymmetry. We consider a novel approach that generalizes the conventional self-normalizer to involve recursive estimators of both directions. For this generalized class of self-normalizers, we explore its time-symmetric subspace and a data-driven weight choice that corresponds to confidence intervals of minimal length. We study the asymptotic behavior of such a data-driven weight choice through the von Mises expansion with Gateaux derivatives, and find an interesting dichotomy between linear and nonlinear quantities. This is joint work with Liliya Lavitas.
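To make the construction concrete, the following is a minimal Python sketch for the mean. It is not the authors' implementation: the forward self-normalizer follows the standard Shao-style form based on forward recursive means, the backward version reverses the recursion, and `generalized_sn` combines the two with an illustrative weight `w` (the paper's data-driven weight choice is not reproduced here).

```python
import numpy as np

def forward_sn(x):
    """Conventional self-normalizer for the mean, built from
    forward recursive estimators (means of x[0..t-1], t = 1..n)."""
    n = len(x)
    t = np.arange(1, n + 1)
    fwd = np.cumsum(x) / t            # forward recursive means
    full = fwd[-1]                    # full-sample mean
    return np.sum((t * (fwd - full)) ** 2) / n**2

def backward_sn(x):
    """Mirror-image self-normalizer, built from backward recursive
    estimators (means of x[k..n-1] for each starting index k)."""
    n = len(x)
    m = np.arange(n, 0, -1)           # backward window lengths
    bwd = np.cumsum(x[::-1])[::-1] / m
    full = np.mean(x)
    return np.sum((m * (bwd - full)) ** 2) / n**2

def generalized_sn(x, w=0.5):
    """Illustrative weighted combination of both directions;
    the weight w is a placeholder, not the paper's optimal choice."""
    return w * forward_sn(x) + (1 - w) * backward_sn(x)
```

Note the symmetry that motivates the generalization: `backward_sn(x)` equals `forward_sn(x[::-1])`, so a one-directional self-normalizer treats a series and its time reversal differently, while the equally weighted combination is invariant under time reversal.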

Authors who are presenting talks have a * after their name.
