
Abstract Details

Activity Number: 461 - SPEED: Machine Learning
Type: Contributed
Date/Time: Wednesday, August 2, 2017, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322930
Title: Whiteout: Gaussian Adaptive Regularization Noise in Deep Neural Networks
Author(s): Yinan Li* and Ruoyi Xu and Fang Liu
Companies: University of Notre Dame and University of Science and Technology of China and University of Notre Dame
Keywords: bridge ; elastic net ; regularization ; robustness ; consistency ; backpropagation
Abstract:

Noise injection (NI) is a method to mitigate overfitting in neural networks (NNs). Recent developments in Bernoulli NI, as implemented in dropout and shakeout, demonstrate the efficiency and feasibility of NI in regularizing NNs. We propose whiteout, a new regularization technique via adaptive Gaussian NI in deep NNs. Whiteout is associated with a closed-form penalized objective function in GLMs that includes bridge, (adaptive) lasso, ridge regression, and elastic net as special cases. Whiteout can also be viewed as robust learning of an NN model in the presence of small perturbations in the input and hidden nodes. The noise-perturbed empirical loss function with whiteout converges almost surely to the ideal loss function, and the estimates of NN parameters obtained from minimizing the former are consistent with those obtained from minimizing the ideal loss function. Whiteout performs better than dropout on small training data, and its objective function is more stable than shakeout's.
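As a concrete illustration of adaptive Gaussian NI, below is a minimal NumPy sketch of one noisy forward pass through a dense layer. The noise-variance form sigma2 * |w|^(-gamma) + lam, the function name whiteout_forward, and all parameter names are illustrative assumptions chosen so that connections with smaller weights receive larger noise; this is a sketch of the general idea, not the authors' exact specification.

import numpy as np

def whiteout_forward(x, W, b, sigma2=0.1, gamma=1.0, lam=0.01, rng=None):
    # Assumed noise model (illustrative, not the paper's exact spec):
    # the input feeding connection (i, j) is perturbed by Gaussian noise
    # with variance sigma2 * |W[i, j]|**(-gamma) + lam, so connections
    # with small weights receive more noise (the "adaptive" part).
    rng = rng if rng is not None else np.random.default_rng()
    safe_abs = np.maximum(np.abs(W), 1e-8)      # avoid division by zero
    std = np.sqrt(sigma2 * safe_abs ** (-gamma) + lam)
    eps = rng.normal(size=W.shape) * std        # per-connection noise
    # Pre-activation: z_j = sum_i (x_i + eps_ij) * W[i, j] + b_j
    z = x @ W + (eps * W).sum(axis=0) + b
    return np.tanh(z)                           # hidden-layer activation

# Toy usage: 4 inputs -> 3 hidden units
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W = rng.normal(size=(4, 3))
b = np.zeros(3)
h = whiteout_forward(x, W, b, rng=rng)

In this sketch, tuning gamma and lam would trade off the l1-like (weight-adaptive) and l2-like (constant-variance) components of the implied penalty, which is how the bridge/lasso/ridge/elastic-net special cases mentioned in the abstract would arise.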


Authors who are presenting talks have a * after their name.
