All Times EDT

Abstract Details

Activity Number: 306 - Algorithmic and Inferential Advances in Univariate and Multivariate Tuning-Parameter-Free Nonparametric Procedures
Type: Topic Contributed
Date/Time: Wednesday, August 5, 2020 : 10:00 AM to 11:50 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #313094
Title: Multivariate Adaptation in Log-Concave Density Estimation
Author(s): Arlene K. H. Kim* and Richard Samworth and Oliver Feng and Adityanand Guntuboyina
Companies: Korea University and University of Cambridge and University of Cambridge and University of California, Berkeley
Keywords: multivariate adaptation; bracketing entropy; log-concavity; contour separation; maximum likelihood estimation

We study the adaptation properties of the multivariate log-concave maximum likelihood estimator f̂ over three subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such a density f can be measured by G(f), the sum of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by f. Given n independent observations from a d-dimensional log-concave density with d ∈ {2, 3}, we prove a sharp oracle inequality, which in particular implies that the Kullback–Leibler risk of f̂ for such densities is bounded above by G(f)/n, up to a polylogarithmic factor. For the second type of adaptation, we consider densities that are bounded away from zero on a polytopal support; we show that, up to polylogarithmic factors, f̂ attains the rate n^{-4/7} when d = 3, which is faster than the worst-case rate of n^{-1/2}. Finally, our third subclass consists of densities whose contours are well separated; here, we prove another sharp oracle inequality.
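The two rate claims above can be summarized in display form. This is a hedged restatement, not from the abstract itself: the expectation notation, the choice of loss symbol for the second bound, and the explicit polylog(n) factor are assumptions made here for readability; the abstract fixes only f̂, G(f), n, d, and the exponents.

```latex
% First result (d = 2 or 3): sharp oracle inequality, implying a
% Kullback-Leibler risk bound for densities with polyhedral support
% and piecewise-affine logarithm. d_KL denotes KL divergence.
\[
  \mathbb{E}\, d_{\mathrm{KL}}\bigl(\hat{f}, f\bigr)
  \;\lesssim\; \frac{G(f)}{n}\,\mathrm{polylog}(n).
\]
% Second result (d = 3): for densities bounded away from zero on a
% polytopal support, the risk (in a suitable squared loss, assumed here)
% improves on the worst-case rate n^{-1/2}:
\[
  \mathbb{E}\, d^{2}\bigl(\hat{f}, f\bigr)
  \;\lesssim\; n^{-4/7}\,\mathrm{polylog}(n)
  \quad\text{versus the worst-case rate } n^{-1/2}.
\]
```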

Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program