Abstract Details

Activity Number: 339 - Model-Fitting, Likelihood-Based Inference, and Their Applications
Type: Contributed
Date/Time: Tuesday, August 1, 2017 : 10:30 AM to 12:20 PM
Sponsor: IMS
Abstract #324586
Title: On the Properties of the Gradient of Log Empirical Likelihood
Author(s): Sanjay Chaudhuri*
Companies: National University of Singapore
Keywords: Empirical likelihood ; Gradient ; Nonconvex optimisation ; Maximum empirical likelihood estimator ; Hamiltonian Monte Carlo ; BayesEL procedures
Abstract:

Empirical likelihood based methods rest on a nonparametric estimator of the data distribution, computed under constraints imposed by parameter-dependent estimating equations. They thus efficiently combine the flexibility of a nonparametric procedure with the interpretability of a parametric model. It is well known that, even for simple regression models, the support of the empirical likelihood is a nonconvex set. Numerical procedures do not behave well on such supports: in most cases they are slow and require complicated tuning to converge to an optimum. Recently there has been considerable interest in the properties of the gradient of the log empirical likelihood. It has been shown that, under mild conditions, with high probability at least one component of this gradient vector diverges at the boundary of the support. In this talk we discuss such properties of the gradient in detail. We also consider several potential uses of the gradient vector in empirical likelihood based methods, in both the frequentist and Bayesian paradigms.
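To make the objects in the abstract concrete, the sketch below computes the profile log empirical likelihood ratio for a one-dimensional mean parameter, l(theta) = -sum_i log(1 + lam*(x_i - theta)), where the Lagrange multiplier lam solves sum_i (x_i - theta)/(1 + lam*(x_i - theta)) = 0; by the envelope theorem the gradient is dl/dtheta = n*lam, which diverges as theta approaches the boundary of the support (the convex hull of the data). This is an illustrative construction of ours, not code from the talk; the function name and the bisection solver are assumptions for the example.

```python
import math

def log_el_and_gradient(theta, x):
    """Profile log empirical likelihood ratio for a scalar mean, and its
    gradient n * lam.  Valid only for theta strictly inside the convex
    hull of the data (here, min(x) < theta < max(x))."""
    n = len(x)
    z = [xi - theta for xi in x]
    pos = [zi for zi in z if zi > 0]
    neg = [zi for zi in z if zi < 0]
    if not pos or not neg:
        raise ValueError("theta must lie strictly inside the convex hull of x")
    # lam must keep every implied weight positive: 1 + lam * z_i > 0.
    lo = max(-1.0 / zi for zi in pos)
    hi = min(-1.0 / zi for zi in neg)

    def g(lam):
        # Strictly decreasing in lam; -> +inf at lo, -> -inf at hi,
        # so it has a unique root on (lo, hi).
        return sum(zi / (1.0 + lam * zi) for zi in z)

    # Bisection for the root of g on (lo, hi).
    a = lo + 1e-10 * (hi - lo)
    b = hi - 1e-10 * (hi - lo)
    for _ in range(200):
        m = 0.5 * (a + b)
        if g(m) > 0.0:
            a = m
        else:
            b = m
    lam = 0.5 * (a + b)
    logel = -sum(math.log(1.0 + lam * zi) for zi in z)
    return logel, n * lam

x = [1.0, 2.0, 3.0, 4.0, 5.0]
# At theta = sample mean, lam = 0, so both quantities are (numerically) zero.
print(log_el_and_gradient(3.0, x))
# As theta nears max(x) = 5, |lam| and hence the gradient n*lam blow up.
print(log_el_and_gradient(4.9, x))
```

The growing gradient magnitude near the hull boundary is exactly the behavior the abstract describes, and it is what makes gradient-based samplers such as Hamiltonian Monte Carlo attractive in BayesEL procedures: the diverging gradient repels trajectories from the nonconvex support boundary.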


Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association