Abstract:
|
Regression models in which the observed features $X \in \R^p$ and the response $Y \in \R$ depend jointly on a lower-dimensional, unobserved latent vector $Z \in \R^K$, with $K\ll p$, are popular in a large array of applications, and are mainly used for predicting a response from correlated features. In contrast, methodology and theory for inference on the regression coefficient $\beta\in \R^K$ relating $Y$ to $Z$ are scarce, since the unobservable factor $Z$ is typically hard to interpret. Furthermore, the determination of the asymptotic variance of an estimator of $\beta$ is a long-standing problem, with solutions known only in a few particular cases.
To address some of these outstanding questions, we develop inferential tools for $\beta$ in a class of factor regression models in which the observed features are signed mixtures of the latent factors. The model specifications are practically desirable in a large array of applications, render the components of $Z$ interpretable, and are sufficient for parameter identifiability.
|