Abstract:
|
Fr\'{e}chet regression has received considerable scholarly attention for handling metric-space random object responses, i.e., complex non-Euclidean data such as matrices, graphs, and probability distributions. However, several questions remain unresolved in the development of Fr\'{e}chet sufficient dimension reduction for ultra-high-dimensional settings. This paper studies Fr\'{e}chet sufficient variable selection with a graph structure among multivariate Euclidean predictors. We propose a penalized difference-of-trace loss that avoids directly using the inverse of a large covariance matrix. The proposed estimator can be easily applied to high-dimensional predictors while incorporating prior graph information on the predictors to improve accuracy and consistency. Theoretically, we derive the asymptotic consistency of the estimated sufficient dimension reduction coefficients as well as the oracle property of the variable selection. We demonstrate the superior finite-sample performance of our proposals over existing methods through comprehensive simulations and data analysis.
|