
Abstract Details

Activity Number: 663 - New Developments in Modern Statistical Estimation Theory
Type: Contributed
Date/Time: Thursday, August 3, 2017, 10:30 AM to 12:20 PM
Sponsor: IMS
Abstract #324186
Title: Adaptation in Log-Concave Density Estimation
Author(s): Arlene Kyoung Hee Kim* and Adityanand Guntuboyina and Richard J. Samworth
Companies: Sungshin Women's University and UC Berkeley and Statistical Laboratory, University of Cambridge
Keywords: log-concavity ; MLE ; adaptation ; rates of convergence ; Marshall's inequality ; bracketing entropy
Abstract:

The log-concave maximum likelihood estimator of a density on the real line based on a sample of size n is known to attain the minimax optimal rate of convergence of O(n^(-4/5)) with respect to, e.g., squared Hellinger distance. In this paper, we show that it also enjoys attractive adaptation properties, in the sense that it achieves a faster rate of convergence when the logarithm of the true density is k-affine (i.e. made up of k affine pieces), provided k is not too large. Our results use two different techniques: the first relies on a new Marshall's inequality for log-concave density estimation, and reveals that when the true density is close to log-linear on its support, the log-concave maximum likelihood estimator can achieve the parametric rate of convergence in total variation distance. Our second approach depends on local bracketing entropy methods, and allows us to prove a sharp oracle inequality, which implies in particular that the rate of convergence with respect to various global loss functions, including Kullback-Leibler divergence, is O((k/n)(log(n))^(5/4)) when the true density is log-concave and its logarithm is close to k-affine.


Authors who are presenting talks have a * after their name.

