Abstract Details

Activity Number: 236 - SLDS Student Paper Awards
Type: Topic Contributed
Date/Time: Monday, July 30, 2018, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #327085
Title: Sparse-Input Neural Networks for High-Dimensional Nonparametric Regression and Classification
Author(s): Jean Feng* and Noah Simon
Companies: University of Washington
Keywords: Neural networks; Nonparametric; High-dimensional; Regularization; Sparsity

Neural networks are usually not the tool of choice for nonparametric high-dimensional problems: although neural nets can approximate complex multivariate functions, they generally require large amounts of data to perform well. We show that neural networks are effective in high-dimensional settings if the true function falls in a low-dimensional subspace and proper regularization is used. We propose fitting a neural network with a sparse group lasso penalty on the first-layer weights to encourage models that employ a small number of features. We characterize the statistical convergence of the penalized empirical risk minimizer: the excess risk of this estimator grows only logarithmically in the number of input features, and the weights of irrelevant features converge to zero. Via simulations and data analyses, we show that these sparse-input neural networks outperform existing nonparametric high-dimensional estimation methods when the data exhibit complex higher-order interactions.
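To make the penalty concrete, here is a minimal NumPy sketch of a sparse group lasso applied to a first-layer weight matrix, with each input feature's outgoing weights forming one group. The parameter names (`lam`, `alpha`) and the row-per-feature grouping are illustrative assumptions, not necessarily the exact formulation or tuning used in the paper.

```python
import numpy as np

def sparse_group_lasso_penalty(W1, lam, alpha):
    """Sparse group lasso penalty on first-layer weights.

    W1    : array of shape (n_features, n_hidden); row j holds the
            outgoing weights of input feature j (one group per feature).
    lam   : overall penalty strength (illustrative name).
    alpha : mix between elementwise lasso (alpha=1) and group lasso
            (alpha=0); illustrative name.

    The group term can zero out an entire feature's row at once,
    while the lasso term additionally sparsifies individual weights.
    """
    lasso = np.abs(W1).sum()                  # elementwise l1 term
    group = np.linalg.norm(W1, axis=1).sum()  # l2 norm of each feature's row
    return lam * (alpha * lasso + (1 - alpha) * group)

# Example: feature 0 has weights (3, 4), feature 1 is fully zeroed out.
W = np.array([[3.0, 4.0],
              [0.0, 0.0]])
print(sparse_group_lasso_penalty(W, lam=1.0, alpha=0.5))  # 0.5*7 + 0.5*5 = 6.0
```

In practice this penalty would be added to the training loss (e.g., squared error or cross-entropy) and minimized with a proximal or subgradient method, since the penalty is non-smooth at zero.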

Authors who are presenting talks have a * after their name.