Abstract:
|
In many situations regression analysis is mostly concerned with inference about the conditional mean of the response given the predictors, and less concerned with other aspects of the conditional distribution. In this paper we develop dimension reduction methods that incorporate this consideration. We introduce the notion of the Central Mean Subspace (CMS), a natural inferential object for dimension reduction when the mean function alone is of interest. We study the properties of the CMS and develop various methods to estimate it. These methods include a new class of estimators that require fewer conditions than principal Hessian directions (pHd) and that display a clear advantage when one of the conditions for pHd is violated. The CMS also reveals a transparent distinction among existing methods for dimension reduction: ordinary least squares, pHd, sliced inverse regression, and sliced average variance estimation. The new methods are applied to analyze a data set involving recumbent cows. We also discuss the estimation and inference issues surrounding these methods.
|