Activity Number: 187 - Contributed Poster Presentations: Section on Nonparametric Statistics
Type: Contributed
Date/Time: Monday, July 29, 2019: 10:30 AM to 12:20 PM
Sponsor: Section on Nonparametric Statistics
Abstract #305090
Title: On the Rate of Convergence of a Neural Network Regression Estimate Learned by Gradient Descent
Author(s): Alina Braun* and Michael Kohler and Harro Walk
Companies: Technische Universität Darmstadt and Technische Universität Darmstadt and Universität Stuttgart
Keywords: gradient descent; neural networks; nonparametric regression; rate of convergence; projection pursuit
Abstract:
Nonparametric regression with random design is considered. Estimates are defined by minimizing a penalized empirical L_2 risk over a suitably chosen class of neural networks with one hidden layer via gradient descent. Here, the gradient descent procedure is repeated several times with randomly chosen starting values for the weights, and from the resulting list of estimates the one with the minimal empirical L_2 risk is chosen. Under the assumption that the number of randomly chosen starting values and the number of gradient descent steps are sufficiently large, it is shown that the resulting estimate achieves (up to a logarithmic factor) the optimal rate of convergence in a projection pursuit model. The finite sample size performance of the estimates is illustrated using simulated data.
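The restart-and-select scheme described in the abstract can be sketched as follows. This is a minimal illustration only: the network width, learning rate, number of gradient steps, and number of restarts are illustrative choices, not the values analyzed in the paper, and the penalty term is omitted for brevity.

```python
import numpy as np

def empirical_l2_risk(params, x, y):
    """Empirical L_2 risk of a one-hidden-layer tanh network."""
    w1, b1, w2, b2 = params
    pred = np.tanh(x @ w1 + b1) @ w2 + b2
    return np.mean((pred - y) ** 2)

def gradient_descent_fit(x, y, hidden=8, steps=2000, lr=0.05, rng=None):
    """Fit the network by plain gradient descent on the empirical L_2 risk,
    starting from randomly chosen weights (illustrative hyperparameters)."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[1]
    w1 = rng.normal(size=(d, hidden))   # random starting values for the weights
    b1 = rng.normal(size=hidden)
    w2 = rng.normal(size=(hidden, 1))
    b2 = np.zeros(1)
    n = len(y)
    for _ in range(steps):
        h = np.tanh(x @ w1 + b1)        # hidden-layer activations
        pred = h @ w2 + b2
        err = (pred - y) * (2.0 / n)    # gradient of the mean squared error
        gw2 = h.T @ err
        gb2 = err.sum(axis=0)
        gh = (err @ w2.T) * (1 - h ** 2)  # backpropagate through tanh
        gw1 = x.T @ gh
        gb1 = gh.sum(axis=0)
        w1 -= lr * gw1; b1 -= lr * gb1
        w2 -= lr * gw2; b2 -= lr * gb2
    return (w1, b1, w2, b2)

def best_of_restarts(x, y, restarts=5, seed=0):
    """Repeat gradient descent from several random starting values and keep
    the estimate with the minimal empirical L_2 risk."""
    rng = np.random.default_rng(seed)
    fits = [gradient_descent_fit(x, y, rng=rng) for _ in range(restarts)]
    return min(fits, key=lambda p: empirical_l2_risk(p, x, y))
```

By construction, the selected estimate has empirical L_2 risk no larger than that of any single gradient descent run in the list, mirroring the selection step in the abstract.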
Authors who are presenting talks have a * after their name.