
Abstract Details

Activity Number: 181 - SPEED: Statistical Learning and Data Science Speed Session 1, Part 2
Type: Contributed
Date/Time: Monday, July 29, 2019 : 10:30 AM to 11:15 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #307523
Title: Deep Learning and MARS: a Connection
Author(s): Sophie Langer* and Michael Kohler and Adam Krzyzak
Companies: Technische Universitaet Darmstadt and Technische Universitaet Darmstadt and Concordia University
Keywords: Curse of dimensionality; deep neural networks; MARS; nonparametric regression; rate of convergence; piecewise partitioning
Abstract:

We consider least squares regression estimates based on deep neural networks. We show that these estimates satisfy an oracle inequality, which implies that (up to a logarithmic factor) their error is at least as small as the optimal error bound one would expect for MARS if that procedure worked in the optimal way. As a consequence, our neural networks achieve a dimensionality reduction whenever the regression function has locally low dimensionality. This assumption seems realistic in real-world applications, since many high-dimensional data sets are confined to locally low-dimensional distributions. In a simulation study we provide numerical experiments that support our theoretical results and compare our estimate with other conventional nonparametric regression estimates, in particular with MARS.
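The following is a minimal illustrative sketch (not the authors' code) of the kind of comparison described above: a least squares neural network regression estimate versus MARS on a synthetic regression function that depends on only a few coordinates locally, despite a higher ambient dimension. It assumes scikit-learn for the network and, optionally, the py-earth package (pyearth.Earth) for the MARS baseline; the data-generating function and all parameter choices are hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
d = 10                       # ambient dimension
n_train, n_test = 2000, 1000

def m(x):
    # Hypothetical regression function: on each region it depends on only
    # one or two coordinates, so its local dimensionality is low despite d = 10.
    return np.where(x[:, 0] > 0, np.sin(3 * x[:, 1]), x[:, 2] ** 2)

X_train = rng.uniform(-1, 1, size=(n_train, d))
X_test = rng.uniform(-1, 1, size=(n_test, d))
y_train = m(X_train) + 0.1 * rng.standard_normal(n_train)
y_test = m(X_test)

# Least squares deep neural network estimate (squared loss is MLPRegressor's default).
net = MLPRegressor(hidden_layer_sizes=(64, 64, 64), max_iter=5000, random_state=0)
net.fit(X_train, y_train)
print("DNN  test MSE:", mean_squared_error(y_test, net.predict(X_test)))

# MARS baseline, if the optional py-earth package is installed.
try:
    from pyearth import Earth
    mars = Earth(max_degree=2)
    mars.fit(X_train, y_train)
    print("MARS test MSE:", mean_squared_error(y_test, mars.predict(X_test)))
except ImportError:
    print("py-earth not installed; skipping the MARS baseline.")
```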


Authors who are presenting talks have a * after their name.
