We propose a penalized likelihood framework for estimating multiple precision matrices. The framework uses a non-convex penalty to simultaneously detect groupings of similar precision matrices and produce their estimates. Optimization proceeds by an algorithm that alternates between a k-means clustering step, which updates the groupings of related precision matrices, and a block coordinate descent step, which updates the precision matrix estimates. When a ridge-type penalty is applied to the elements of each precision matrix, each iteration of the block coordinate descent step has a closed-form solution based on a spectral decomposition. When an l1 penalty is applied to the elements of each precision matrix, each iteration of the block coordinate descent step uses a proximal gradient descent algorithm to solve an elastic net penalized precision matrix estimation problem. We also consider convergence properties, model selection, and other computational issues, and we discuss applications of our method to discriminant analysis and Gaussian graphical models.
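The abstract does not give the exact penalty parameterization, so the following is only a minimal sketch of the closed-form spectral update for a single precision matrix under an assumed Frobenius (ridge-type) penalty, i.e., minimizing tr(S Omega) - log det(Omega) + (lam/2)||Omega||_F^2; the objective and function name are illustrative, not the paper's notation:

```python
import numpy as np

def ridge_precision(S, lam):
    """Closed-form ridge-penalized precision matrix estimate.

    Minimizes tr(S @ Omega) - log det(Omega) + (lam/2) * ||Omega||_F^2
    over positive definite Omega via the spectral decomposition of S.
    (The penalty parameterization here is an assumption for illustration.)
    """
    # Eigendecompose the sample covariance: S = V diag(s) V^T.
    s, V = np.linalg.eigh(S)
    # Stationarity holds eigenvalue-wise: lam*w^2 + s*w - 1 = 0.
    # The positive root gives the eigenvalues of the estimate,
    # so the solution is always positive definite.
    w = (-s + np.sqrt(s**2 + 4.0 * lam)) / (2.0 * lam)
    return V @ np.diag(w) @ V.T

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
S = np.cov(X, rowvar=False)
Omega = ridge_precision(S, lam=0.5)

# Stationarity check: the gradient S - Omega^{-1} + lam*Omega vanishes.
grad = S - np.linalg.inv(Omega) + 0.5 * Omega
```

Because each eigenvalue of the estimate solves a scalar quadratic, the update costs one eigendecomposition per iteration, which is what makes the ridge variant of the block coordinate descent step cheap.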