FAIR Privacy v2.
Information
  Information Collection
    –Surveillance
    –Interrogation
  Information Processing
    –Aggregation
    –Insecurity
    –Identification
    –Secondary Use
    –Exclusion
  Information Dissemination
    –Breach of Confidentiality
    –Disclosure
    –Exposure
    –Increased Accessibility
    –Appropriation
Non-Information
  Invasion
    –Intrusion
    –Decisional Interference
Note: you can use other sets of (moral) privacy harms.
Adverse Tangible Consequences
Subjective
  Psychological
    –Embarrassment
    –Anxiety
    –Suicide
  Behavioral
    –Changed Behavior
    –Reclusion
Objective
  Lost Opportunity
    –Employment
    –Insurance & Benefits
    –Housing
    –Education
  Economic Loss
    –Inconvenience
    –Financial Cost
  Social Detriment
    –Loss of Trust
    –Ostracism
  Loss of Liberty
    –Bodily Injury
    –Restriction of Movement
    –Incarceration
    –Death
EXAMPLE
This example comes from the paper "Quantitative Privacy Risk," published in the
proceedings of the 2021 IEEE European Symposium on Security and Privacy Workshops
(EuroS&PW). For more detail and exploration, see that paper at
https://doi.org/10.1109/EuroSPW54576.2021.00043
The provided Excel spreadsheet (v2.11) has more details on the calculations used in this
example.
EXAMPLE: Surveillance Risk of Smart Locks
Breakdown by factors
Capability
The skills and resources available to threat actors in a given situation to act.
Difficulty
The impediments that a threat actor in a given situation must overcome to act.
Note: Without impediments, the capability was presumed to exceed difficulty, leading to
100% vulnerability for the inherent/baseline risk calculation.
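To make the relationship between these two factors concrete, vulnerability can be estimated as the probability that a sampled capability exceeds a sampled difficulty. The sketch below is a minimal Monte Carlo illustration, not the paper's or the spreadsheet's implementation; the beta-PERT shapes, the 0-100 scale, and every parameter value are assumptions.

import random

def pert_sample(low, mode, high, lam=4.0):
    """Sample from a beta-PERT distribution given min / most-likely / max estimates."""
    if high == low:
        return low
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + random.betavariate(alpha, beta) * (high - low)

def vulnerability(capability_params, difficulty_params, trials=100_000):
    """Vulnerability = probability that threat-actor capability exceeds difficulty."""
    wins = sum(
        pert_sample(*capability_params) > pert_sample(*difficulty_params)
        for _ in range(trials)
    )
    return wins / trials

# Hypothetical inputs on a 0-100 scale.
print(vulnerability((20, 60, 95), (10, 40, 80)))
# With no impediments (difficulty fixed at 0), every attempt succeeds: vulnerability = 1.0,
# which matches the 100% baseline vulnerability noted above.
print(vulnerability((20, 60, 95), (0, 0, 0)))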
EXAMPLE: Surveillance Risk of Smart Locks
Breakdown by factors
Vulnerability
The probability that threat actors' attempts will succeed.
Severity
The degree to which an activity violates social norms of privacy.
Note: The severity distribution was based on a survey to determine whether such
surveillance exceeded social norms of behavior.
Harm Magnitude
The severity of the harm in the at-risk population and the tangible consequential
risks to them.
Adverse Consequence Risk
The frequency and magnitude of adverse tangible consequences on the threatened population.
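One way to read how Severity and Adverse Consequence Risk feed Harm Magnitude (an illustrative decomposition, not the paper's exact formula): the expected per-event harm combines the subjective loss when the surveillance is perceived to violate norms with the expected loss from adverse tangible consequences. All names and figures below are hypothetical.

def expected_harm_per_event(p_norm_violation, norm_violation_loss,
                            p_adverse_consequence, consequence_loss):
    """Illustrative harm-magnitude roll-up: expected subjective loss from a
    perceived norm violation plus expected loss from tangible consequences."""
    return (p_norm_violation * norm_violation_loss
            + p_adverse_consequence * consequence_loss)

# Hypothetical figures: 55% of surveyed users feel the surveillance exceeds norms
# (loss 150 each); 2% of events lead to a tangible consequence (loss 1,000).
print(expected_harm_per_event(0.55, 150, 0.02, 1000))   # 102.5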
Privacy Risk
Measuring privacy risk by itself doesn't provide much to take action on.
Threat Frequency
The frequency, given a time frame, that threat actors threaten the at-risk population.
Harm Magnitude
The severity of the harm in the at-risk population and the tangible consequential
risks to them.
Privacy Risk
The frequency of privacy threats and magnitude of privacy harms for the at-risk population.
Annualized privacy risk, without controls:
  Minimum: 94,857.90
  10th Percentile: 125,193.83
  Most Likely: 153,936.65
  90th Percentile: 189,088.86
  Maximum: 233,222.64
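Distribution figures like the ones above fall out of propagating ranged estimates rather than point values. The sketch below shows the general shape of such a simulation; the input distributions, ranges, and the beta-PERT helper are placeholders and are not the values used in the paper or the v2.11 spreadsheet.

import random
import statistics

def pert_sample(low, mode, high, lam=4.0):
    """Beta-PERT sample from a min / most-likely / max estimate."""
    if high == low:
        return low
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + random.betavariate(alpha, beta) * (high - low)

def annualized_privacy_risk(vulnerability=1.0, trials=50_000):
    """Annualized privacy risk = threat frequency x vulnerability x harm magnitude,
    simulated over the at-risk population with placeholder ranges."""
    losses = []
    for _ in range(trials):
        threat_frequency = pert_sample(500, 900, 1400)   # threat events per year
        harm_magnitude = pert_sample(100, 170, 260)      # loss per successful event
        losses.append(threat_frequency * vulnerability * harm_magnitude)
    losses.sort()
    return {
        "Minimum": losses[0],
        "10th Percentile": losses[int(0.10 * trials)],
        "Median": statistics.median(losses),             # slide reports "Most Likely"
        "90th Percentile": losses[int(0.90 * trials)],
        "Maximum": losses[-1],
    }

for label, value in annualized_privacy_risk().items():   # baseline: vulnerability = 1.0
    print(f"{label:>15}: {value:,.2f}")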
EXAMPLE: Surveillance Risk of Smart Locks
Comparisons
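The kind of comparison this slide draws (presumably annualized risk with and without controls, given the "Without Controls" baseline above) can be expressed by rerunning the earlier sketch with a control modeled as a change to a single factor. Purely as an illustration, a hypothetical control is assumed to defeat 40% of surveillance attempts, i.e. to lower vulnerability; this snippet reuses annualized_privacy_risk from the previous sketch.

# Hypothetical control assumed to defeat 40% of attempts (vulnerability 1.0 -> 0.6).
without_controls = annualized_privacy_risk(vulnerability=1.0)
with_controls = annualized_privacy_risk(vulnerability=0.6)
reduction = without_controls["90th Percentile"] - with_controls["90th Percentile"]
print(f"Risk reduction at the 90th percentile: {reduction:,.2f}")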
Resources
• R. J. Cronk and S. S. Shapiro, "Quantitative Privacy Risk Analysis," 2021 IEEE
European Symposium on Security and Privacy Workshops (EuroS&PW), 2021,
pp. 340-350, doi: 10.1109/EuroSPW54576.2021.00043.
• R. Jason Cronk, “Analyzing Privacy Risk Using FAIR” (Jan 14, 2019) FAIR
Institute https://www.fairinstitute.org/blog/analyzing-privacy-risk-using-fair
• FAIR Institute, https://www.fairinstitute.org/
• R. Jason Cronk, “Why privacy risk analysis must not be harm focused” (Jan 15,
2019) IAPP
https://iapp.org/news/a/why-privacy-risk-analysis-must-not-be-harm-focused/
EXTRA
• Jaap-Henk Hoepman, "Privacy Design Strategies," Jan 2019
• Daniel J. Solove, "A Taxonomy of Privacy," University of Pennsylvania Law Review, Jan 2006