
Abstract Details

Activity Number: 127 - SPEED: Statistical Learning and Data Science Speed Session 1, Part 1
Type: Contributed
Date/Time: Monday, July 29, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #304136
Title: Activation Adaptation in Neural Networks
Author(s): Vahid Partovi Nia* and Farnoush Farhadi and Andrea Lodi
Companies: Huawei Technologies and Ecole Polytechnique de Montreal; Ericsson; Ecole Polytechnique de Montreal
Keywords: Adaptive Activation Function; Classification; Deep Learning; Neural Networks
Abstract:

Many neural network architectures rely on the choice of an activation function for each hidden layer. Given the activation function, the network is trained over the bias and weight parameters. The bias captures the center of the activation and the weights capture its scale. Here we propose to train the network over a shape parameter as well. This allows each neuron to tune its own activation function and adapt its curvature towards a better prediction. The modification adds only one further equation to back-propagation for each neuron. Re-formalizing activation functions as cumulative distribution functions (CDFs) greatly generalizes the class of activation functions, and we use this view to study i) skewness and ii) smoothness of activation functions. We introduce the adaptive Gumbel activation function as a bridge between the Gumbel and sigmoid activations, and use a similar approach to derive a smooth version of ReLU. Comparison with common activation functions suggests a different data representation, especially in early network layers, and the adaptation also improves prediction.
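The abstract does not spell out the parameterization, so the following is only a minimal sketch of the idea it describes: a per-neuron trainable shape parameter whose CDF-style activation reduces to the sigmoid at one setting and approaches the Gumbel CDF at another. The form g(x) = 1 - (1 + s * exp(x))^(-1/s), the module name AdaptiveGumbel, and the num_features argument are assumptions made for illustration, not the authors' exact implementation; the shape parameter is learned by back-propagation along with the weights and biases.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGumbel(nn.Module):
    """Sigmoid-like activation with a trainable shape parameter per neuron.

    Assumed form: g(x) = 1 - (1 + s * exp(x)) ** (-1 / s), with s > 0.
    s = 1 recovers the logistic sigmoid; as s -> 0 the function approaches
    the Gumbel CDF 1 - exp(-exp(x)), so s interpolates between the two.
    """

    def __init__(self, num_features: int):
        super().__init__()
        # One unconstrained parameter per neuron; softplus keeps s > 0.
        # Initialized so that softplus(raw) == 1, i.e. at the sigmoid case.
        init = math.log(math.e - 1.0)
        self.raw_shape = nn.Parameter(torch.full((num_features,), init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        s = F.softplus(self.raw_shape)  # broadcasts over the batch dimension
        return 1.0 - (1.0 + s * torch.exp(x)).pow(-1.0 / s)


# Example: each hidden neuron learns its own activation shape.
net = nn.Sequential(nn.Linear(20, 64), AdaptiveGumbel(64), nn.Linear(64, 1))
```

Because the shape parameter enters the forward pass like any other tensor, automatic differentiation supplies the single extra gradient per neuron mentioned in the abstract; no custom backward rule is needed in this sketch.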


Authors who are presenting talks have a * after their name.
