Abstract:
|
We consider the problem of structure learning in graphical models for $p$-variate Gaussian random variables, where the dimension $p$ may be large. The conditional independence structure of the variables is encoded in the zero pattern of the corresponding precision (inverse covariance) matrix, and Gaussian graphical models provide a natural framework for recovering this structure. We take a Bayesian approach to graphical model structure learning, placing shrinkage priors on the precision matrix. We develop an efficient MCMC algorithm for sampling from the resulting posterior distribution and provide theoretical guarantees for posterior convergence under the $L_2$-norm.
|