Abstract Details

Activity Number: 614 - Statistical Methods for Longitudinal and Other Dependent Data
Type: Contributed
Date/Time: Thursday, August 1, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Nonparametric Statistics
Abstract #306602
Title: Adaptation in Log-Concave Density Estimation
Author(s): Oliver Feng* and Richard Samworth and Arlene Kyoung Hee Kim and Adityanand Guntuboyina
Companies: University of Cambridge and University of Cambridge and Sungshin University and University of California at Berkeley
Keywords: multivariate adaptation; log-concavity; bracketing entropy; contour separation; maximum likelihood estimation

We study the adaptation properties of the multivariate log-concave maximum likelihood estimator (MLE) over two subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such a density f can be measured in terms of the sum Γ(f) of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by f. Given n iid observations from a d-dimensional log-concave density with d = 2 or 3, we prove a sharp oracle inequality which implies that the Kullback-Leibler risk of the log-concave MLE for such densities is at most Γ(f)/n, up to a polylogarithmic factor. The second type of subclass consists of densities whose contours are well-separated; these new classes are constructed to be affine invariant and turn out to contain a wide variety of densities, including those that satisfy Hölder regularity conditions. We prove a sharp oracle inequality which reveals that, when d = 3, the log-concave MLE attains a Kullback-Leibler risk bound of order n^{-min((β+3)/(β+7), 4/7)} over the class of β-Hölder log-concave densities with 1 < β ≤ 3, again up to a polylogarithmic factor.
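As a small illustration (not part of the abstract itself), the Hölder-class risk exponent min((β+3)/(β+7), 4/7) for d = 3 can be evaluated directly; the function name below is our own choice:

```python
# Illustrative sketch: evaluate the exponent a in the abstract's bound
# n^{-min((beta+3)/(beta+7), 4/7)} for d = 3, valid for 1 < beta <= 3.
def risk_exponent(beta: float) -> float:
    """Exponent a such that the stated KL risk bound is n^(-a), up to polylogs."""
    if not 1 < beta <= 3:
        raise ValueError("bound stated only for 1 < beta <= 3")
    return min((beta + 3) / (beta + 7), 4 / 7)

for beta in (1.5, 2.0, 7 / 3, 3.0):
    print(f"beta = {beta:.4f}: exponent = {risk_exponent(beta):.4f}")
```

Note that (β+3)/(β+7) = 4/7 exactly at β = 7/3, so the exponent increases with smoothness up to that point and is capped at 4/7 for smoother densities.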

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program