Abstract:
|
  Differential privacy (DP) provides a framework for strong, provable privacy protection against arbitrary adversaries while allowing the release of summary statistics and, potentially, synthetic data. DP mechanisms require the introduction of randomness, which reduces the utility of the results, especially in finite samples. In this talk we describe a general framework, based on sound statistical principles from measurement error, robustness, and likelihood-based inference, and give specific examples of how to achieve optimal statistical inference under formal privacy, given privatized data releases and the parameters of the privacy mechanism, with a focus on survey and census data products. This talk is based on joint work with J. Awan, V. Karwa, R. Molinari, and J. Seeman.
|