
Abstract Details

Activity Number: 184 - SPEED: Variable Selection and Networks
Type: Contributed
Date/Time: Monday, July 31, 2017 : 11:35 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #325299
Title: Parsimonious and Efficient Construction of Composite Likelihood Equations by L1-Penalization
Author(s): Zhendong Huang*
Companies:
Keywords: composite likelihood estimation ; high-dimensional data ; sparsity-inducing penalization
Abstract:

The astonishing growth in size and complexity of modern data challenges the applicability of traditional likelihood-based inference. Composite likelihood (CL) addresses the difficulties of model selection and the computational intractability of the full likelihood by combining a number of low-dimensional likelihood objects into a single objective function used for inference. In this paper, we propose a new procedure that combines sub-likelihood objects from a large set of feasible candidates while simultaneously carrying out parameter estimation. Our method finds CL estimating equations by minimizing the estimated distance from the full likelihood subject to a constraint representing the afforded computing cost. The resulting CL is sparse, since it contains a relatively small number of informative sub-likelihoods while noisy or redundant components are discarded. An asymptotic theory for the new estimator is provided in the setting where both the sample size and the data dimension grow. The new procedure is implemented by a fast least-angle algorithm, and its finite-sample properties are illustrated through numerical simulations.
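The core idea — sparsely weighting candidate sub-likelihood score components so their combination approximates the full-likelihood score — can be illustrated with a toy sketch. This is not the paper's algorithm (which uses a least-angle path method and a computing-cost constraint); it is a hypothetical illustration that solves the analogous L1-penalized least-squares problem by coordinate descent, with simulated score vectors standing in for real sub-likelihood scores.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty.
    return np.sign(z) * max(abs(z) - t, 0.0)

def sparse_cl_weights(S, g, lam, n_iter=200):
    """Toy lasso via coordinate descent: min_w 0.5*||g - S w||^2 + lam*||w||_1.

    S : (n, p) matrix whose columns are candidate sub-likelihood score vectors
    g : (n,) full-likelihood score that the composite score should approximate
    lam : L1 penalty level controlling how many sub-likelihoods are retained
    """
    n, p = S.shape
    w = np.zeros(p)
    col_norm2 = (S ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with component j removed.
            r = g - S @ w + S[:, j] * w[j]
            w[j] = soft_threshold(S[:, j] @ r, lam) / col_norm2[j]
    return w

# Simulated setting: 2 informative candidates plus 6 pure-noise candidates.
rng = np.random.default_rng(0)
n, p = 200, 8
S = rng.standard_normal((n, p))
g = 1.5 * S[:, 0] - 2.0 * S[:, 1] + 0.1 * rng.standard_normal(n)
w = sparse_cl_weights(S, g, lam=20.0)
# Informative components keep nonzero weight; noisy ones are driven to zero,
# mirroring the sparsity of the selected composite likelihood.
```

The penalty level plays the role of the computing-cost constraint in the abstract: raising `lam` retains fewer sub-likelihood components, trading statistical efficiency for a cheaper composite objective.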


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association