Minimum divergence estimators possess the dual properties of efficiency and robustness and are increasingly used in various scientific investigations. In privacy applications, it is of interest to obtain estimators that are both efficient and differentially private. However, the resulting estimators are often highly non-robust, leading to incorrect inferences. In this presentation, we describe new higher-order Edgeworth expansions for minimum divergence estimators and use them to obtain robust, efficient, and differentially private estimators. We demonstrate our results using simulations and real data from healthcare settings. Implications of our results for policy development will also be discussed.