Tags: attacker advantage, d-privacy, differential privacy
Abstract:
Differential privacy is a privacy technique with provable guarantees, typically achieved by adding noise to statistics before releasing them. The level of privacy is characterized by a numeric parameter ε > 0, where smaller ε means stronger privacy. However, there is no common agreement on how small ε should be, and the actual likelihood of data leakage at the same ε may vary across different released statistics and different datasets.
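To make the role of ε concrete, the following minimal sketch shows the standard Laplace mechanism for a counting query (the function name and values are illustrative, not taken from the paper): noise is drawn with scale sensitivity/ε, so a smaller ε yields larger noise and hence more privacy.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    # ε-differential privacy via additive Laplace noise:
    # the noise scale is sensitivity / ε, so smaller ε => more noise.
    scale = sensitivity / epsilon
    return true_value + rng.laplace(0.0, scale)

rng = np.random.default_rng(0)
count = 42  # e.g., a counting query with sensitivity 1

noisy_strict = laplace_mechanism(count, 1.0, 0.1, rng)   # strong privacy, noisy
noisy_loose = laplace_mechanism(count, 1.0, 10.0, rng)   # weak privacy, accurate
```

With ε = 0.1 the released count is perturbed by noise of scale 10, while with ε = 10 the scale is only 0.1; the released value is accurate but reveals more about the underlying data.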
In this paper, we show how to relate ε to the increase in the probability of the attacker's success in guessing something about the private data. The attacker's goal is stated as a Boolean expression over guesses of particular categorical and numerical attributes, where numerical attributes can be guessed up to some precision. The paper builds upon the definition of d-privacy, which is a generalization of ε-differential privacy.
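As a point of comparison for this kind of interpretation, a sketch of the standard Bayes-factor bound for pure ε-DP follows (this is not the paper's d-privacy analysis, and the function names are illustrative): an ε-DP release multiplies the attacker's posterior odds by at most e^ε, which bounds the increase over the prior guessing probability p.

```python
import math

def posterior_bound(prior, epsilon):
    # For an ε-DP release, the likelihood ratio (Bayes factor) of any
    # observation is at most e^ε, so the attacker's posterior probability
    # of a correct guess is at most  e^ε·p / (1 + (e^ε − 1)·p).
    b = math.exp(epsilon)
    return b * prior / (1.0 + (b - 1.0) * prior)

def advantage_bound(prior, epsilon):
    # Attacker advantage: the increase over the prior success probability.
    return posterior_bound(prior, epsilon) - prior
```

For example, with a uniform prior p = 0.5 and ε = ln 3, the posterior is bounded by 0.75, i.e. an advantage of at most 0.25.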
Interpreting Epsilon of Differential Privacy in Terms of Advantage in Guessing or Approximating Sensitive Attributes