
Abstract Details

Activity Number: 218740 - An Introduction to Differential Privacy (ADDED FEE)
Type: Professional Development
Date/Time: Tuesday, July 30, 2019 : 8:00 AM to 12:00 PM
Sponsor: ASA
Abstract #308049
Title: An Introduction to Differential Privacy (ADDED FEE)
Author(s): Robert Ashmead* and William Sexton* and Philip Leclerc*
Companies: Ohio Colleges of Medicine Government Resource Center and U.S. Census Bureau
Keywords:
Abstract:

Differential privacy is a relatively unfamiliar concept to many statisticians, but it is growing in prominence as industry and government adopt it for collecting or publishing data while limiting privacy loss and disclosure risk. For example, the U.S. Census Bureau plans to use differentially private methods for the release of data products from the 2020 Census. Differential privacy is a mathematical property of noise-infusion disclosure-limitation systems which, if respected, allows one to quantify the total increase in risk to a person's privacy, due to the use of their data, across all publications originating from a confidential source such as the 2020 Census. Unlike historically prominent disclosure-limitation systems, differentially private algorithms do not rely on keeping their methods, code, or parameters secret for their privacy assurances, which allows the statistical community to rigorously incorporate the impact of differentially private noise into statistical inference.
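The noise-infusion idea can be sketched in R with the Laplace mechanism, the canonical basic differentially private method. This is an illustrative sketch, not material from the course; the function names (`rlaplace`, `dp_count`) and parameters are our own choices.

```r
# Illustrative sketch of the Laplace mechanism (not from the course
# materials). A counting query has sensitivity 1: adding or removing
# one person changes the count by at most 1, so Laplace noise with
# scale 1/epsilon gives epsilon-differential privacy for the count.

# Draw from a Laplace(0, scale) distribution as the difference of
# two independent exponentials.
rlaplace <- function(n, scale) {
  rexp(n, rate = 1 / scale) - rexp(n, rate = 1 / scale)
}

# Release a noisy count of a 0/1 attribute with privacy-loss
# parameter epsilon.
dp_count <- function(x, epsilon) {
  sum(x) + rlaplace(1, scale = 1 / epsilon)
}

set.seed(1)
x <- rbinom(100, 1, 0.3)   # toy 0/1 attribute for 100 people
dp_count(x, epsilon = 1)    # noisy count; value varies by run
```

Because the mechanism is public, an analyst knows the released count equals the true count plus Laplace(0, 1/epsilon) noise and can account for that noise in downstream inference.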

The goal of this class is to introduce participants to the motivations, basic principles, interpretations, and analysis of differential privacy and differentially private methods. The learning objectives are that participants will be able to 1) understand the differences between differentially private and legacy methods; 2) explain and interpret differential privacy; 3) apply basic differentially private mechanisms to data; and 4) analyze data to which some common differentially private methods have been applied. Please see below for a proposed outline of the course.

We assume no prior knowledge of differential privacy or of disclosure limitation methods in general. Some background in mathematical statistics, including Bayesian statistics, will be helpful for understanding the theory and interpretation of differential privacy. We will use the R programming language to illustrate examples throughout the class, so it will be helpful if participants have at least a basic understanding of R or a similar language and bring, on the day of the course, a laptop with a recent version of R (>= 3.1.2) installed.


Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program