
Abstract Details

Activity Number: 659 - Recent Advances in Dimension Reduction and Clustering
Type: Contributed
Date/Time: Thursday, August 1, 2019, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #304576
Title: Efficient Local Kernel Estimation Using Structured Random Forests
Author(s): Joshua Loyal* and Ruoqing Zhu and Xin Zhang and Yifan Cui
Companies: University of Illinois Urbana-Champaign and University of Illinois Urbana-Champaign and Florida State University and University of Pennsylvania
Keywords: Random Forests; Smoothing and Nonparametric Regression; Sufficient Dimension Reduction; Kernel Methods
Abstract:

We introduce a new random forest framework that has the potential to exploit an indexed low-dimensional structure to improve statistical efficiency. The innovations are threefold. First, we introduce a new forest-based method, dimension-reducing random forests, that adaptively determines the optimal linear combination split at each internal node by utilizing sufficient dimension reduction techniques. Unlike existing approaches, our method maintains computational efficiency without making restrictive assumptions about the global or local structure of the regression function. Second, viewing random forests as adaptive kernel generators, we are able to learn a dimension-reduced kernel that has the potential to improve the rate of convergence. Finally, we introduce a new forest-based localized sliced inverse regression method that can capture the dimension reduction subspace as effectively as existing sufficient dimension reduction approaches. To illustrate the advantages of our method, we conduct extensive experiments on both synthetic and real datasets.
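The "adaptive kernel generator" view above admits a compact illustration: a fitted forest induces a kernel in which two points are similar in proportion to how often they land in the same leaf across trees. The sketch below is a hypothetical example, not the authors' implementation; it uses scikit-learn's ordinary axis-aligned RandomForestRegressor as a stand-in, whereas the dimension-reducing forest in the abstract would instead split on SDR-estimated linear combinations at each node, letting the induced kernel adapt to the low-dimensional index structure.

```python
# A minimal sketch (assumption: scikit-learn's standard axis-aligned
# forest stands in for the paper's dimension-reducing forest) of the
# forest-as-adaptive-kernel view: K[i, j] counts how often two points
# share a leaf across the trees of a fitted forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta = np.zeros(10)
beta[0] = 1.0                      # y depends on one index: low-dimensional structure
y = np.sin(X @ beta) + 0.1 * rng.standard_normal(200)

forest = RandomForestRegressor(n_estimators=100, min_samples_leaf=5,
                               random_state=0).fit(X, y)

def forest_kernel(forest, X1, X2):
    """K[i, j] = fraction of trees in which X1[i] and X2[j] share a leaf."""
    leaves1 = forest.apply(X1)     # shape (n1, n_trees): leaf index per tree
    leaves2 = forest.apply(X2)
    return (leaves1[:, None, :] == leaves2[None, :, :]).mean(axis=2)

# Local prediction at query points: a Nadaraya-Watson style weighted
# average, with the forest-induced kernel supplying the weights.
X_new = rng.standard_normal((5, 10))
K = forest_kernel(forest, X_new, X)          # shape (5, 200)
y_hat = K @ y / K.sum(axis=1)
print(y_hat)
```

With axis-aligned splits this kernel ignores the index structure in beta; replacing the node splits with sufficient-dimension-reduction directions, as the abstract proposes, is what would concentrate the kernel's bandwidth along the informative linear combination.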


Authors who are presenting talks have a * after their name.
