
Abstract Details

Activity Number: 227
Type: Invited
Date/Time: Monday, August 1, 2016 : 2:00 PM to 3:50 PM
Sponsor: Committee on Privacy and Confidentiality
Abstract #318341
Title: Learning with Differential Privacy: Stability, Learnability, and the Sufficiency and Necessity of ERM Principle
Author(s): Yu-Xiang Wang* and Jing Lei and Stephen E. Fienberg
Companies: Carnegie Mellon University
Keywords: Differential Privacy ; Learnability ; ERM ; Statistical Learning Theory
Abstract:

While machine learning has proven to be a powerful data-driven solution to many real-life problems, its use in sensitive domains has been limited by privacy concerns. A popular approach known as differential privacy offers provable privacy guarantees, but in practice it can substantially degrade learning accuracy. In this paper we study learnability (whether a problem can be learned by any algorithm) under Vapnik's general learning setting with a differential privacy constraint, and reveal some intricate relationships between privacy, stability, and learnability.
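The privacy guarantee mentioned above is typically obtained by injecting calibrated noise into a released statistic. As a minimal illustration (my own sketch, not part of the abstract), the Laplace mechanism achieves ε-differential privacy by adding noise scaled to the query's sensitivity; here I assume records are bounded in [0, 1], so a mean query over n records has sensitivity 1/n:

```python
import numpy as np

def laplace_mechanism(data, query, sensitivity, epsilon, rng):
    """Release query(data) plus Laplace(sensitivity/epsilon) noise,
    which yields epsilon-differential privacy for that query."""
    true_value = float(query(data))
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

rng = np.random.default_rng(0)
data = rng.uniform(0.0, 1.0, size=1000)  # records assumed bounded in [0, 1]

# A mean over n bounded records changes by at most 1/n when one
# record changes, so its sensitivity is 1/n.
n = len(data)
private_mean = laplace_mechanism(data, np.mean, sensitivity=1.0 / n,
                                 epsilon=0.1, rng=rng)
```

Smaller ε means stronger privacy but larger noise (scale 1/(nε)), which is the privacy-accuracy tension the abstract refers to.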

In particular, we show that a problem is privately learnable if and only if there is a private algorithm that asymptotically minimizes the empirical risk (AERM). In contrast, for non-private learning, AERM alone is not sufficient for learnability. In light of this, we propose a conceptual procedure that learns any privately learnable problem. Lastly, we extend part of the results to (ε,δ)-differential privacy and establish the existence of a phase transition in approximate private learnability with respect to how small δ needs to be.
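To make the AERM notion concrete, here is a hedged sketch (my own illustration, not the paper's procedure) of a private learner that is still AERM. For squared loss on [0, 1]-bounded data, the empirical risk minimizer is the sample mean; perturbing it with Laplace noise of scale 1/(nε) gives ε-differential privacy, and since the noise shrinks as 1/n, the excess empirical risk of the private output vanishes as n grows:

```python
import numpy as np

def private_erm_mean(data, epsilon, rng):
    """Output perturbation: the ERM for squared loss over [0, 1]-bounded
    data is the sample mean (sensitivity 1/n). Adding Laplace(1/(n*eps))
    noise makes the release epsilon-differentially private while keeping
    the algorithm AERM, since the noise scale decays with n."""
    n = len(data)
    erm = float(np.mean(data))  # exact empirical risk minimizer
    return erm + rng.laplace(loc=0.0, scale=1.0 / (n * epsilon))

rng = np.random.default_rng(1)
data = rng.uniform(0.0, 1.0, size=5000)
theta_priv = private_erm_mean(data, epsilon=0.5, rng=rng)

# Excess empirical risk of the private output relative to the ERM;
# it shrinks toward 0 as the sample size grows.
excess = (theta_priv - float(np.mean(data))) ** 2
```

This toy case only illustrates the direction "private AERM implies private learnability"; the abstract's result is the stronger equivalence for general problems in Vapnik's setting.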


Authors who are presenting talks have a * after their name.

Back to the full JSM 2016 program

Copyright © American Statistical Association