Adding controlled noise from predetermined distributions is a way of designing differentially private mechanisms. This technique is useful for designing private mechanisms for real-valued functions on sensitive data. Some commonly used distributions for adding noise include Laplace and Gaussian distributions.
Definitions
Let 𝒟 be a collection of all datasets and f : 𝒟 → ℝ be a real-valued function. The sensitivity[1] of a function, denoted Δf, is defined by

Δf = max |f(D₁) − f(D₂)|,

where the maximum is over all pairs of datasets D₁ and D₂ in 𝒟 differing in at most one element. For functions with higher dimensions, the sensitivity is usually measured under the ℓ₁ or ℓ₂ norm.
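The definition can be made concrete with a small sketch in Python. The function and data below are purely illustrative: a counting query changes by at most 1 when one record is added or removed, so its sensitivity is 1. The helper only explores neighbors of one fixed dataset, so it gives a lower bound on the worst-case sensitivity, shown for intuition rather than as a general algorithm.

```python
# A toy illustration of sensitivity (all names and data here are hypothetical).
# count_above counts the records exceeding a threshold, so changing one record
# can change the output by at most 1: the sensitivity of a counting query is 1.
def count_above(dataset, threshold=50):
    return sum(1 for x in dataset if x > threshold)

def removal_sensitivity(f, dataset):
    """Max change in f over neighbors obtained by removing one element.

    This explores neighbors of a single dataset only, so it is a lower
    bound on the true (worst-case) sensitivity.
    """
    base = f(dataset)
    return max(abs(base - f(dataset[:i] + dataset[i + 1:]))
               for i in range(len(dataset)))

data = [23, 67, 81, 45, 90]
print(removal_sensitivity(count_above, data))  # prints 1
```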
Throughout this article, M is used to denote a randomized algorithm that releases a sensitive function f under ε- (or (ε, δ)-) differential privacy.
Laplace Mechanism
The Laplace mechanism adds noise drawn from a Laplace distribution:

M(x) = f(x) + Lap(μ = 0, b = Δf/ε),

[Figure: Laplace mechanism offering 0.5-differential privacy for a function with sensitivity 1.]
where μ = 0 is the expectation of the Laplace distribution and b = Δf/ε is the scale parameter. Roughly speaking, a small amount of noise suffices for a weak privacy constraint (corresponding to a large value of ε), while a greater level of noise provides a greater degree of uncertainty about the original input (corresponding to a small value of ε).
To argue that the mechanism satisfies ε-differential privacy, it suffices to show that the output distribution of M(x) is close in a multiplicative sense to that of M(x′) for every pair of neighboring datasets x and x′. Since the density of Lap(0, b) at z is proportional to exp(−|z|/b), for every output z,

p_{M(x)}(z) / p_{M(x′)}(z) = exp((|f(x′) − z| − |f(x) − z|)/b) ≤ exp(|f(x) − f(x′)|/b) ≤ exp(ε),

where b = Δf/ε. The first inequality follows from the triangle inequality and the second from the sensitivity bound. A similar argument gives a lower bound of exp(−ε).
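The mechanism itself is a one-liner in practice. The sketch below assumes NumPy; the function name is illustrative, not from a standard library. Averaging many releases of the same value concentrates around the true answer, since the Laplace noise has mean zero.

```python
import numpy as np

# A minimal sketch of the Laplace mechanism (name is illustrative).
def laplace_mechanism(f_value, sensitivity, epsilon, rng=None):
    """Release f(x) + Lap(0, b) with scale b = sensitivity / epsilon."""
    rng = np.random.default_rng() if rng is None else rng
    return f_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Privatize a count (sensitivity 1) at epsilon = 0.5.
rng = np.random.default_rng(0)
noisy_count = laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.5, rng=rng)
```

Note that a smaller ε gives a larger scale b and therefore noisier releases, matching the intuition above.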
Gaussian Mechanism
Analogous to the Laplace mechanism, the Gaussian mechanism adds noise drawn from a Gaussian distribution whose variance is calibrated according to the sensitivity and the privacy parameters ε and δ.
The Gaussian mechanism releases

M(x) = f(x) + N(0, σ²), where σ = √(2 ln(1.25/δ)) · Δf / ε,

and N(0, σ²) denotes Gaussian noise with mean 0 and variance σ².
Note that, unlike the Laplace mechanism, the Gaussian mechanism only satisfies (ε, δ)-differential privacy. To prove this, it is sufficient to show that, with probability at least 1 − δ, the distribution of M(x) is close to that of M(x′). The proof is a little more involved (see Appendix A in Dwork and Roth[2]).
Mechanisms for High Dimensional Functions
For high-dimensional functions of the form f : 𝒟 → ℝᵈ, where d ≥ 1, the sensitivity of f is measured under the ℓ₁ or ℓ₂ norm. The equivalent Gaussian mechanism that satisfies (ε, δ)-differential privacy for such a function is

M(x) = f(x) + (Y₁, ..., Y_d),

where Δ₂f denotes the sensitivity of f under the ℓ₂ norm and (Y₁, ..., Y_d) is a d-dimensional vector in which each coordinate Y_i is noise sampled according to N(0, σ²), with σ = √(2 ln(1.25/δ)) · Δ₂f / ε, independently of the other coordinates (see Dwork and Roth[2] for the proof).
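The vector case differs only in drawing one independent Gaussian sample per coordinate, calibrated to the ℓ₂ sensitivity. A minimal sketch, assuming NumPy (the function name is illustrative):

```python
import numpy as np

# Vector-valued Gaussian mechanism: i.i.d. N(0, sigma^2) noise per coordinate,
# with sigma calibrated to the l2 sensitivity (name is illustrative).
def gaussian_mechanism_vector(f_vec, l2_sensitivity, epsilon, delta, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    f_vec = np.asarray(f_vec, dtype=float)
    sigma = np.sqrt(2 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    return f_vec + rng.normal(loc=0.0, scale=sigma, size=f_vec.shape)

# Privatize a 3-dimensional statistic with l2 sensitivity 1.
rng = np.random.default_rng(2)
noisy_vec = gaussian_mechanism_vector([1.0, 2.0, 3.0], 1.0, 0.5, 1e-5, rng=rng)
```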
References
^ a b Dwork, Cynthia; McSherry, Frank; Nissim, Kobbi; Smith, Adam (2006). "Calibrating Noise to Sensitivity in Private Data Analysis". Theory of Cryptography: 265–284. doi:10.1007/11681878_14.
^ a b Dwork, Cynthia; Roth, Aaron (2014). "The Algorithmic Foundations of Differential Privacy". Foundations and Trends in Theoretical Computer Science. 9 (3–4): 211–407. doi:10.1561/0400000042.