
All Times EDT

Abstract Details

Activity Number: 498 - Modern Machine Learning
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #309800
Title: Deep Learning with Gaussian Differential Privacy
Author(s): Zhiqi Bu*
Companies: University of Pennsylvania
Keywords: Differential privacy; Deep learning; Neural network

Deep learning models are often trained on datasets that contain sensitive information such as individuals' shopping transactions, personal contacts, and medical records. An increasingly important line of work has therefore sought to train neural networks subject to privacy constraints specified by differential privacy or its divergence-based relaxations. These privacy definitions, however, have weaknesses in handling certain important primitives (composition and subsampling), and thereby give loose or complicated privacy analyses of neural network training. In this paper, we consider a recently proposed privacy definition, f-differential privacy, for a refined privacy analysis of training neural networks. We derive analytically tractable expressions for the privacy guarantees of both stochastic gradient descent and Adam when used to train deep neural networks. Our results demonstrate that the f-differential privacy framework improves on prior analyses, yielding better prediction accuracy without exceeding the privacy budget, as shown by our experiments on a range of tasks in image classification, text classification, and recommender systems.
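The private training the abstract analyzes is noisy (differentially private) SGD: each example's gradient is clipped to a fixed norm, the clipped gradients are averaged, and calibrated Gaussian noise is added before the parameter update. A minimal NumPy sketch is below; the function names `dp_sgd_update` and `gdp_mu` are illustrative, and the closed form in `gdp_mu` is the central-limit-theorem approximation of the Gaussian differential privacy parameter for subsampled noisy SGD from the f-DP/GDP literature, not necessarily the paper's exact expression.

```python
import math
import numpy as np

def dp_sgd_update(params, per_example_grads, clip_norm, noise_multiplier, lr, rng):
    """One noisy-SGD step: clip per-example gradients to clip_norm,
    average them, add Gaussian noise, and take a gradient step."""
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    n = len(clipped)
    # Noise standard deviation scales with the clipping norm (the sensitivity).
    noise = noise_multiplier * clip_norm * rng.standard_normal(params.shape)
    noisy_mean = (np.sum(clipped, axis=0) + noise) / n
    return params - lr * noisy_mean

def gdp_mu(steps, sample_rate, noise_multiplier):
    """CLT approximation of the Gaussian DP parameter mu after `steps`
    iterations with Poisson subsampling rate `sample_rate` (assumed form
    from the GDP literature; smaller mu means stronger privacy)."""
    return sample_rate * math.sqrt(
        steps * (math.exp(1.0 / noise_multiplier ** 2) - 1.0))
```

With `noise_multiplier = 0` the update reduces to ordinary clipped SGD, which makes the clipping and averaging easy to check in isolation; the privacy guarantee then comes entirely from the Gaussian noise tracked by `gdp_mu`.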

Authors who are presenting talks have a * after their name.
