Abstract Details

Activity Number: 262 - Emerging Statistical Theory and Methods in Deep Learning
Type: Invited
Date/Time: Wednesday, August 11, 2021, 1:30 PM to 3:20 PM (EDT)
Sponsor: Section for Statistical Programmers and Analysts
Abstract #316967
Title: Probabilistic Connection Importance Inference and Lossless Compression of Deep Neural Networks
Author(s): Xin (Shayne) Xing*
Companies: Virginia Tech
Keywords:
Abstract:

Deep neural networks (DNNs) can be huge in size, requiring a considerable amount of energy and computational resources to operate, which limits their applications in numerous scenarios. It is thus of interest to compress DNNs while maintaining their performance. We will present a probabilistic importance inference approach for pruning DNNs. Specifically, we test the significance of the relevance of each connection in a DNN to the DNN's outputs using a nonparametric scoring test and keep only the significant connections. Experimental results show that the proposed approach achieves better lossless compression rates than existing techniques.
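A rough illustration of the connection-pruning idea described above, not the authors' method: the abstract does not specify the scoring test, so the sketch below scores each first-layer connection of a toy two-layer network by the output change observed when that connection is zeroed on calibration data, and prunes connections whose score falls below a fixed tolerance standing in for the significance test. All names (forward, W1, tolerance) and the cutoff are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: outputs = relu(X @ W1.T) @ W2.T
d_in, d_hid, d_out = 8, 16, 3
W1 = rng.normal(size=(d_hid, d_in))
W2 = rng.normal(size=(d_out, d_hid))
X = rng.normal(size=(256, d_in))   # calibration inputs

def forward(W1, W2, X):
    return np.maximum(X @ W1.T, 0.0) @ W2.T

baseline = forward(W1, W2, X)

# Score each first-layer connection by the mean absolute change in the
# network outputs when that single connection is removed (set to zero).
scores = np.zeros_like(W1)
for i in range(d_hid):
    for j in range(d_in):
        W1_drop = W1.copy()
        W1_drop[i, j] = 0.0
        scores[i, j] = np.mean(np.abs(forward(W1_drop, W2, X) - baseline))

# Keep only connections whose score clears a tolerance; this fixed cutoff is
# a placeholder for the nonparametric significance test in the abstract.
tolerance = 1e-2
mask = scores > tolerance
W1_pruned = W1 * mask

kept = mask.mean()
drift = np.mean(np.abs(forward(W1_pruned, W2, X) - baseline))
print(f"kept {kept:.1%} of first-layer connections; mean output drift {drift:.4f}")
```

In practice the pruning decision would come from the probabilistic importance inference the talk presents, applied layer by layer to a trained network rather than to random weights as in this toy example.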


Authors who are presenting talks have a * after their name.
