With high-dimensional longitudinal and functional data becoming increasingly common, there is a strong need for methods of estimating large covariance matrices. The task is difficult because sample covariance matrices are unstable in high dimensions and estimates must be constrained to be positive definite. A modified Cholesky decomposition of the precision matrix recasts covariance estimation as a two-dimensional varying-coefficient model with an unconstrained parameter space.
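As a minimal numerical sketch of the decomposition referred to above (not the paper's estimator), the precision matrix of a positive-definite covariance Σ can be written as Σ⁻¹ = TᵀD⁻¹T, with T unit lower triangular and D diagonal; the below-diagonal entries of T and the logarithms of the diagonal of D are then unconstrained:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Sigma = A @ A.T + 5 * np.eye(5)     # a positive-definite covariance matrix

C = np.linalg.cholesky(Sigma)       # Sigma = C C', C lower triangular
Dhalf = np.diag(np.diag(C))
T = Dhalf @ np.linalg.inv(C)        # unit lower triangular
D = Dhalf @ Dhalf                   # diagonal, positive entries

# Modified Cholesky identity: T Sigma T' = D, so Sigma^{-1} = T' D^{-1} T
assert np.allclose(T @ Sigma @ T.T, D)
assert np.allclose(T.T @ np.linalg.inv(D) @ T, np.linalg.inv(Sigma))
```

Any real values below the diagonal of T and any positive diagonal D reproduce a valid positive-definite matrix, which is what makes the parameter space unconstrained.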
Regularization improves the stability of estimates in high dimensions, as well as when functional data are sparse and individual curves are sampled at different, possibly unequally spaced time points. We propose a general approach to this potentially ill-posed problem using penalized tensor-product B-splines. By using a large number of equidistant knots together with simple difference penalties, we sidestep the difficulty of selecting the number and position of the knots. These penalties lead to null models previously presented in the literature. We present numerical results and a data analysis to illustrate the utility of the proposed method.
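To illustrate the P-spline idea in one dimension (a simplified sketch, not the paper's two-dimensional tensor-product estimator), the following fits a cubic B-spline basis with many equidistant interior knots and shrinks adjacent coefficients through a second-order difference penalty; `BSpline.design_matrix` is assumed to be available (SciPy ≥ 1.8):

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(200)

# Cubic basis on a generous grid of equidistant interior knots
k = 3
n_inner = 20
knots = np.concatenate([np.repeat(0.0, k + 1),
                        np.linspace(0, 1, n_inner + 2)[1:-1],
                        np.repeat(1.0, k + 1)])
B = BSpline.design_matrix(x, knots, k).toarray()  # n x (n_inner + k + 1)

# Second-order difference penalty on neighboring coefficients
m = B.shape[1]
D2 = np.diff(np.eye(m), n=2, axis=0)
lam = 1.0                                          # smoothing parameter
coef = np.linalg.solve(B.T @ B + lam * D2.T @ D2, B.T @ y)
fit = B @ coef
```

With the penalty doing the smoothing, the exact number of knots matters little as long as it is large, which is why knot selection can be sidestepped; as `lam` grows, the fit shrinks toward the penalty's null space (here, straight lines).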