Two open source packages to enable Privacy and Security: SmartNoise (differential privacy) and Counterfit (assessing ML model security)
Describe Differential Privacy and how Azure uses it
Azure implements differential privacy by randomizing data and adding statistical noise, concealing PII from data scientists while keeping aggregate results useful.
Explain what Epsilon (ε) is in the Differential Privacy equation and how it affects the result
The common form of Differential Privacy is ε-Differential Privacy. The ε value is a non-negative number that measures the privacy loss in the data.
Lower ε Values: More Privacy (i.e. more Noise added) / Less Accuracy
Higher ε Values: Less Privacy (i.e. less Noise added) / More Accuracy
Accuracy refers to the accuracy of your Model, which is trained on noisier data when Epsilon is lower.
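The epsilon/noise trade-off can be sketched with the standard Laplace mechanism. This is an illustrative example, not Azure or SmartNoise code; the function names `laplace_sample` and `private_count` are made up for this sketch.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    while abs(u) >= 0.5:  # avoid log(0) in the rare edge case
        u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: float, epsilon: float, sensitivity: float = 1.0) -> float:
    # Laplace mechanism: noise scale = sensitivity / epsilon, so a smaller
    # epsilon means a larger noise scale (more privacy, less accuracy).
    return true_count + laplace_sample(sensitivity / epsilon)

random.seed(0)
noisy_strict = private_count(100, epsilon=0.1)  # heavy noise, strong privacy
noisy_loose = private_count(100, epsilon=10.0)  # light noise, weak privacy
```

Running `private_count` many times with a low ε scatters results widely around the true count; with a high ε the results stay close to it.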
How Epsilon (ε) is used in Differential Privacy
Accuracy refers to the accuracy of your Model. The more noise added, the less accurate your Model will be since it’s training on “noisy” data.
Explain what your Privacy Budget represents and how you use it up
Privacy Budget is the amount of privacy loss that is “acceptable”. When you run multiple queries against a data set, total privacy loss accumulates. The accumulation is measured by summing the Epsilon value of each query.
How Counterfit is used
Example code:
import counterfit
import counterfit.targets as targets

# Load a built-in target model, then build and run an attack against it
target = targets.CreditFraud()
target.load()
attack_name = 'hop_skip_jump'
new_attack = counterfit.Counterfit.build_attack(target, attack_name)
results = counterfit.Counterfit.run_attack(new_attack)
The three supported Target Types in Counterfit
Text, Image and Tabular
Attacks can be configured to work against any Target, though they have default compatible types (T/F)
False. An Attack is compatible only with specific Target types; it cannot work on just any type.