Online Discussion Meeting: Gaussian Differential Privacy

Date: Wednesday 16 December 2020, 5.00PM
Location: Online
Section Group Meeting



The Discussion paper ‘Gaussian Differential Privacy’ by Jinshuo Dong, Aaron Roth, and Weijie J. Su will be presented by authors Weijie J. Su and Jinshuo Dong of the University of Pennsylvania, Philadelphia, USA.

Everyone welcome. 

The preprint of the paper is available below, and we welcome your contributions in the usual way during the meeting and/or in writing afterwards by 1 January 2021.

Download the preprint (PDF)

Read more about our Discussion meetings.

An introductory pre-meeting (DeMO) on the paper, presented by Jinshuo Dong, takes place beforehand at 3.30-4.30pm.

Please register separately for the DeMO and the Discussion meeting that follows.

 
‘Gaussian Differential Privacy’

Abstract:

In the past decade, differential privacy has seen remarkable success as a rigorous and practical formalization of data privacy. This privacy definition and its divergence-based relaxations, however, have several acknowledged weaknesses, either in handling composition of private algorithms or in analyzing important primitives like privacy amplification by subsampling. Inspired by the hypothesis testing formulation of privacy, this paper proposes a new relaxation of differential privacy, which we term “f-differential privacy” (f-DP). This notion of privacy has a number of appealing properties and, in particular, avoids difficulties associated with divergence-based relaxations. First, f-DP faithfully preserves the hypothesis testing interpretation of differential privacy, thereby making the privacy guarantees easily interpretable. In addition, f-DP allows for lossless reasoning about composition in an algebraic fashion. Moreover, we provide a powerful technique to import existing results proven for the original differential privacy definition to f-DP and, as an application of this technique, obtain a simple and easy-to-interpret theorem of privacy amplification by subsampling for f-DP. In addition to the above findings, we introduce a canonical single-parameter family of privacy notions within the f-DP class that is referred to as “Gaussian differential privacy” (GDP), defined based on hypothesis testing of two shifted Gaussian distributions. GDP is the focal privacy definition among the family of f-DP guarantees due to a central limit theorem for differential privacy that we prove. More precisely, the privacy guarantees of any hypothesis-testing-based definition of privacy (including the original differential privacy definition) converge to GDP in the limit under composition. We also prove a Berry–Esseen-style version of the central limit theorem, which gives a computationally inexpensive tool for tractably analyzing the exact composition of private algorithms.
Taken together, this collection of attractive properties renders f-DP a mathematically coherent, analytically tractable, and versatile framework for private data analysis. Finally, we demonstrate the use of the tools we develop by giving an improved analysis of the privacy guarantees of noisy stochastic gradient descent.
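To make the central definition concrete ahead of the meeting: GDP is parameterized by the trade-off function G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu) for testing N(0, 1) against N(mu, 1), and mu-GDP mechanisms compose to sqrt(mu_1^2 + ... + mu_k^2)-GDP. The following is a minimal sketch of these two formulas in plain Python (standard library only); it is an illustration written for this page, not code from the paper.

```python
from statistics import NormalDist
import math

_std = NormalDist()  # standard normal N(0, 1)

def gaussian_tradeoff(mu: float, alpha: float) -> float:
    """G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu): the smallest type II
    error achievable at type I error alpha when distinguishing
    N(0, 1) from N(mu, 1)."""
    return _std.cdf(_std.inv_cdf(1.0 - alpha) - mu)

def compose_gdp(mus):
    """Composition for GDP: running mu_i-GDP mechanisms in sequence
    yields sqrt(sum of mu_i^2)-GDP."""
    return math.sqrt(sum(m * m for m in mus))

# mu = 0 is perfect privacy: the two hypotheses are indistinguishable,
# so G_0(alpha) = 1 - alpha.
print(gaussian_tradeoff(0.0, 0.25))   # 0.75
# Ten compositions of a 0.1-GDP mechanism give roughly 0.316-GDP.
print(compose_gdp([0.1] * 10))
```

Larger mu means the shifted Gaussians are easier to tell apart (a lower trade-off curve), i.e. weaker privacy; the composition rule is what makes GDP tractable under repeated use of private algorithms.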
 
Jinshuo Dong, Aaron Roth and Weijie J. Su, University of Pennsylvania, Philadelphia, USA
 
RSS Research Section Committee
Judith Shorten journal@rss.org.uk