Quantifying and Protecting Location Privacy

PhD Thesis
by

Reza Shokri

on 17 December 2012

Transcript of Quantifying and Protecting Location Privacy

Privacy

Privacy of an individual in relation to an entity is inversely proportional to the amount of her personal information that is exposed to that entity.

Quantitative Privacy

[Diagram: an individual's personal information enters a system as input; the system performs computation, storage, and communication; information reaches the entity through the system's output and its side channels.]

Privacy Enhancing Technologies

Some information leakage is needed to receive a service. This raises three questions:

  • How much personal information is leaked through a system?
  • How much service is provided to the user?
  • How to preserve both privacy and service quality?

Related quantitative approaches: quantitative information flow analysis, differential privacy, and the Bayesian analysis of Mix networks.

Location Privacy

Mobile users share their location with Location-based Services (LBSs).
The service provider can track users, and learn about their habits, interests, activities, and relationships, with the help of background knowledge about them.

Existing protection mechanisms:

  • Anonymization (pseudonymization): replacing the actual user-name with a random identity
  • Location obfuscation: hiding the location, adding noise, reducing precision, ...

The state of the art has several shortcomings:

  • Existing protection mechanisms are not systematic
  • Adversary knowledge and attacks are often ignored
  • A common formal framework is missing
  • Privacy and service-quality requirements are not properly modeled
  • Evaluation is done in an ad-hoc manner, with different metrics

How to evaluate and compare various protection mechanisms? How to preserve both location privacy and service quality?

Quantitative Location Privacy

The thesis answers these questions in three steps: user profiling (constructing the adversary's knowledge), location inference attacks (quantifying location privacy), and the optimal protection strategy against inference attacks.

Quantifying Location Privacy

The privacy of users is the expected estimation error of the adversary in his inference attacks. A Location-Privacy Preserving Mechanism (LPPM) stands between the users' mobility and the location-based service; from the LPPM's output, the adversary mounts tracking, de-anonymization, and localization attacks.
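A minimal sketch of this metric (not the thesis' tooling), assuming a discrete set of regions, an adversary posterior over those regions, and a generic distortion function; all names here are illustrative:

```python
import numpy as np

def location_privacy(posterior, true_loc, dist):
    """Expected estimation error of the adversary: the distortion
    between the true location and the adversary's estimate, averaged
    over the adversary's posterior belief."""
    # posterior[r] = Pr[user is in region r | adversary's observation]
    return sum(p * dist(r, true_loc) for r, p in enumerate(posterior))

# Toy example: 4 regions on a line, absolute-difference distortion.
posterior = np.array([0.1, 0.6, 0.2, 0.1])      # adversary's belief
dist = lambda r, s: abs(r - s)                   # generic distortion
print(location_privacy(posterior, true_loc=1, dist=dist))  # -> 0.5
```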
De-anonymization (Re-identification)

Given the anonymized observed traces and the users' mobility profiles, the adversary matches each pseudonym to a user. The most likely matching is a maximum weight assignment problem, where the weight of a user-pseudonym pair is the likelihood that this user produced that observed trace. Naively, computing this likelihood means summing over all possible hidden traces, which has exponential complexity; the iterative forward-backward algorithm computes it in polynomial time through the forward variables and their initialization, as in the sketch after the localization section below.
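A sketch of the assignment step using SciPy's Hungarian-method solver; the log-likelihood matrix is a toy placeholder here, whereas in the attack it comes from the forward variables shown below:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# loglik[u, p]: log-likelihood that user u generated the trace observed
# under pseudonym p (toy numbers; in the attack they come from running
# the forward algorithm with each user's mobility profile).
loglik = np.log(np.array([[0.70, 0.20, 0.10],
                          [0.10, 0.80, 0.10],
                          [0.25, 0.25, 0.50]]))

# Maximum weight assignment = minimum cost assignment on negated weights.
users, pseudonyms = linear_sum_assignment(-loglik)
print({int(u): int(p) for u, p in zip(users, pseudonyms)})  # -> {0: 0, 1: 1, 2: 2}
```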
Localization

Modeling the user's mobility as a Markov chain over discrete regions turns localization into a standard hidden-Markov-model inference: the forward-backward algorithm yields the adversary's posterior distribution over the user's location at each time instant.
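A minimal dense-matrix sketch of the forward-backward computation, assuming a Markovian mobility profile and a probabilistic observation (LPPM) model; the variable names follow standard HMM notation rather than the thesis' code:

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior Pr[x_t = r | o_1..o_T] over regions (localization).
    pi  - prior over the R regions, shape (R,)
    A   - mobility profile, A[r, s] = Pr[next region = s | current = r]
    B   - observation model, B[r, o] = Pr[LPPM outputs o | region r]
    obs - the observed (obfuscated) trace, length T
    """
    T, R = len(obs), len(pi)
    alpha = np.zeros((T, R))               # forward variables
    beta = np.ones((T, R))                 # backward variables
    alpha[0] = pi * B[:, obs[0]]           # initialization of forward variables
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)
```

The likelihood needed by the de-anonymization step above is a by-product: `alpha[-1].sum()` is the total probability of the observed trace under the given mobility profile.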
Tracking

In the same hidden-Markov model, tracking asks for the user's most likely complete trace given the observations; this is computed by the Viterbi algorithm.
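A matching sketch of the Viterbi recursion, in the log domain for numerical stability; arguments are as in forward_backward above:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely trace argmax Pr[x_1..x_T | o_1..o_T] (tracking)."""
    T, R = len(obs), len(pi)
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]   # best log-score ending in each region
    back = np.zeros((T, R), dtype=int)     # backpointers
    for t in range(1, T):
        scores = delta[:, None] + logA     # scores[r, s]: best path via r into s
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[:, obs[t]]
    trace = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):          # follow the backpointers
        trace.append(int(back[t, trace[-1]]))
    return trace[::-1]
```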
Location-Privacy Meter (LPM)

A software tool implementing this framework, for evaluating location-privacy preserving mechanisms and for evaluating location-privacy metrics. It counts more than 70 unique downloads (from outside EPFL).

Protecting Location Privacy

Anonymization is not enough; location obfuscation is necessary.

Challenges:
  • Service-quality constraints
  • User-based protection
  • Real-time protection

Pitfalls:
  • Ignoring the adversary's knowledge and objective
  • Assuming a suboptimal attack

Objective:
  • Anticipate location inference attacks
  • Maximize users' location privacy against the most effective attack
  • Respect the users' service-quality constraint

Setting:
  • LBS: sporadic access (sparse in time)
  • Adversary: the service provider
  • Attack: localization
  • Protection: user-centric location obfuscation
Optimal Protection: Bayesian Stackelberg Game

The user (leader) commits to an obfuscation mechanism that maximizes her location privacy; the adversary (follower), knowing this mechanism and holding a belief about her mobility, responds with the inference attack that minimizes it. Optimizing both strategies jointly is nonlinear, but once one player's strategy is fixed, the other player's problem becomes linear: the optimal obfuscation and the optimal inference are each obtained by solving a linear program.
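A compact sketch of the leader's linear program, assuming localization as the attack, a quality-loss budget, and pseudo-locations equal to the regions; the function and argument names are illustrative, not the thesis implementation:

```python
import numpy as np
from scipy.optimize import linprog

def optimal_obfuscation(psi, d, dq, q_max):
    """User's side of the Stackelberg game: choose f[o, r], the
    probability of reporting pseudo-location o when in region r, to
    maximize expected privacy against the adversary's best inference,
    subject to a service-quality budget.
    psi   - adversary's belief over the R regions (user profile)
    d     - d[rh, r]: privacy gained when the adversary guesses rh
            while the user is at r (generic distortion)
    dq    - dq[o, r]: quality loss of reporting o from region r
    q_max - maximum tolerated expected quality loss
    """
    R = len(psi)
    O = R                       # pseudo-locations = regions, for simplicity
    nf = O * R                  # f flattened as f[o * R + r]
    n = nf + O                  # plus one variable x[o] per pseudo-location

    c = np.zeros(n)
    c[nf:] = -1.0               # linprog minimizes, so maximize sum_o x[o]

    # x[o] <= sum_r psi[r] * d[rh, r] * f[o, r] for every (o, rh):
    # x[o] is the privacy achieved when the adversary answers o with
    # his best estimate rh (the follower's optimal inference).
    A_ub, b_ub = [], []
    for o in range(O):
        for rh in range(R):
            row = np.zeros(n)
            row[nf + o] = 1.0
            for r in range(R):
                row[o * R + r] = -psi[r] * d[rh, r]
            A_ub.append(row); b_ub.append(0.0)

    # Expected quality loss stays within the budget.
    row = np.zeros(n)
    for o in range(O):
        for r in range(R):
            row[o * R + r] = psi[r] * dq[o, r]
    A_ub.append(row); b_ub.append(q_max)

    # f[., r] must be a probability distribution for each region r.
    A_eq = np.zeros((R, n))
    for r in range(R):
        A_eq[r, r:nf:R] = 1.0   # picks f[o * R + r] for all o
    b_eq = np.ones(R)

    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
    return res.x[:nf].reshape(O, R)   # f[o, r]
```

The inequality constraints encode the follower's best response directly, which is what keeps the leader's problem linear.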
Evaluation

[Plots: location privacy and anonymity of 1 user under the basic and the optimal obfuscation mechanisms, with LBS access prob. = 0.1.]

Conclusion

  • We consider the adversary model as an inseparable element of the analysis and design of privacy-preserving mechanisms
  • We provide theoretical methods, plus software tools, to analyze and design privacy-preserving mechanisms
  • We consider the privacy and service-quality requirements of users
  • The distortion functions are generic
  • The Bayesian framework enables us to quantify the information leakage
  • The game-theoretic framework enables us to anticipate inference attacks

Potential extensions to the framework:

  • Location semantics, and the users' network
  • Optimal strategy for continuous location update
  • Pseudonym-change protection mechanisms

User Profiling: Constructing the Adversary's Background Knowledge

The adversary builds his knowledge by reconstructing the complete traces of a user from her partial traces and, at the same time, learning her mobility model. Both are estimated jointly by Gibbs sampling.
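A toy single-user sketch of such a sampler, assuming a Markov mobility model and some unobserved time instants; the thesis' sampler is richer, but the alternation is the same idea:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_profile(trace, R, iters=500, alpha=1.0):
    """Alternate between (1) resampling the missing locations of a
    partial trace given the current mobility model and (2) resampling
    the Markov transition matrix from its Dirichlet posterior given
    the completed trace.
    trace - list of region indices, with None at unobserved times
    R     - number of regions; alpha - Dirichlet prior parameter
    """
    x = [r if r is not None else int(rng.integers(R)) for r in trace]
    hidden = [t for t, r in enumerate(trace) if r is None]
    A = np.full((R, R), 1.0 / R)                 # initial mobility model
    for _ in range(iters):
        for t in hidden:                         # (1) hidden locations
            p = np.ones(R)
            if t > 0:
                p *= A[x[t - 1], :]              # coming from x[t-1]
            if t < len(x) - 1:
                p *= A[:, x[t + 1]]              # going to x[t+1]
            x[t] = int(rng.choice(R, p=p / p.sum()))
        counts = np.zeros((R, R))                # (2) transition matrix
        for a, b in zip(x, x[1:]):
            counts[a, b] += 1
        A = np.array([rng.dirichlet(alpha + counts[r]) for r in range(R)])
    return A, x
```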
Acknowledgment

Thesis director: Jean-Pierre Hubaux
Thesis committee: George Danezis, Matthias Grossglauser, Jean-Yves Le Boudec, Vitaly Shmatikov
Collaborators: Vincent Bindschaedler, George Danezis, Claudia Diaz, Julien Freudiger, Jean-Pierre Hubaux, Mathias Humbert, Murtuza Jadliwala, Jean-Yves Le Boudec, Panos Papadimitratos, Pedram Pedarsani, Marcin Poturalski, Gael Ravot, Maxim Raya, Francisco Santos, George Theodorakopoulos, Carmela Troncoso