Abstract:
|
Inference concerning Gaussian graphical models involves pairwise conditional dependencies among Gaussian random variables. In such a situation, regularization of a certain form is often employed to treat an overparameterized model, posing challenges for inference. In this talk, we will present a constrained maximum likelihood method for inference, with a focus on alleviating the impact of regularization on inference. For general composite hypotheses, we unregularize hypothesized parameters while regularizing nuisance parameters through an L0-constraint controlling the degree of sparseness. For the likelihood ratio test, the corresponding distribution is chi-square or normal, depending on whether the co-dimension of a test is finite or increases with the sample size. This goes beyond the classical Wilks phenomenon. Some numerical results will be discussed, in addition to an application to network analysis.
|