Activity Number: 120 - SPEED: Variable Selection and Networks
Type: Contributed
Date/Time: Monday, July 31, 2017: 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #324289
Title: Parsimonious and Efficient Construction of Composite Likelihood Equations by L1-Penalization
Author(s): Zhendong Huang*
Keywords: composite likelihood estimation; high-dimensional data; sparsity-inducing penalization

Abstract:
The astonishing growth in the size and complexity of modern data challenges the applicability of traditional likelihood-based inference. Composite likelihood (CL) addresses the difficulties of model selection and the computational intractability of the full likelihood by combining a number of low-dimensional likelihood objects into a single objective function used for inference. In this paper, we propose a new procedure that combines sub-likelihood objects from a large set of feasible candidates while carrying out parameter estimation. Our method finds CL estimating equations by minimizing the estimated distance from the full likelihood subject to a constraint representing the afforded computing cost. The resulting CL is sparse: it retains a relatively small number of informative sub-likelihoods while discarding noisy or redundant components. An asymptotic theory for the new estimator is provided in the setting where both the sample size and the data dimension grow. The new procedure is implemented by a fast least-angle algorithm, and its finite-sample properties are illustrated through numerical simulations.
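The selection idea described above can be illustrated with a toy sketch. This is my own construction, not the paper's procedure: it fits the full-likelihood score with an L1-penalized combination of candidate marginal score components, using a plain coordinate-descent Lasso rather than the paper's least-angle algorithm, and a simple independent-normal toy model in place of a real application. Low-information (high-variance) margins receive zero weight and drop out, giving a sparse composite score.

```python
import numpy as np

# Toy model (illustrative only): d independent normal margins with common
# mean theta; three margins are informative (sd 1), seven are noisy (sd 10).
rng = np.random.default_rng(0)
n, d, theta = 200, 10, 1.0
sig = np.array([1.0] * 3 + [10.0] * 7)
X = theta + sig * rng.standard_normal((n, d))

U = (X - theta) / sig**2        # per-observation marginal score components u_j
y = U.sum(axis=1)               # full-likelihood score (independence model)

def lasso_cd(A, y, lam, iters=500):
    """Coordinate-descent Lasso: min (1/2n)||y - A w||^2 + lam * ||w||_1."""
    n, p = A.shape
    w = np.zeros(p)
    col_sq = (A**2).sum(axis=0) / n
    for _ in range(iters):
        for j in range(p):
            r = y - A @ w + A[:, j] * w[j]          # partial residual for j
            rho = A[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Sparse combination weights: noisy margins are shrunk exactly to zero.
w = lasso_cd(U, y, lam=0.1)
print("selected sub-likelihoods:", np.nonzero(w)[0])
```

Shrinking the penalty `lam` toward zero admits more sub-likelihoods, which is the knob playing the role of the "afforded computing cost" constraint in the abstract's description.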
|
Authors who are presenting talks have a * after their name.