
Privacy-Preserving Surveillance Using Selective Revelation

by Latanya Sweeney


Following the events of September 11, 2001, many in the American public falsely believe they must choose between safety and privacy. This paper proposes an approach to technology (termed "Selective Revelation") that allows data to be shared for surveillance purposes such that shared data have provable assurances of privacy protection while remaining practically useful. Data are provided to a surveillance system with a sliding scale of identifiability, where the level of anonymity matches scientific and evidentiary need. During normal operation, surveillance is conducted on sufficiently anonymous data that is provably useful. When sufficient and necessary scientific evidence merits, the system drills down to increasingly identifiable data. This is a computational model of the "probable cause predicate" performed in American jurisprudence. Under Selective Revelation, the human judges who decide whether information will be shared with law enforcement are replaced with technology that makes these decisions for broader surveillance purposes.

The joined scales match the identifiability of the data (left) to the operational status of the algorithm used in the investigation (right). Under normal operation, sufficiently anonymous data is used. As suspicious behavior is detected, the investigation status lowers, releasing more identifiable data.
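The sliding scale described above can be sketched in code. The following is a minimal illustrative model, not the paper's implementation: the tier names, record fields, and function names are all hypothetical, chosen only to show how an evidence level could gate which fields of a record are revealed.

```python
# Illustrative sketch of the selective-revelation sliding scale.
# Tier names and the record schema below are hypothetical.

# Identifiability tiers, ordered from least to most identifiable.
TIERS = ["anonymous", "de-identified", "identifiable"]

# Fields revealed at each tier (hypothetical schema).
FIELDS_BY_TIER = {
    "anonymous": ["zip3", "age_range"],
    "de-identified": ["zip", "birth_year", "gender"],
    "identifiable": ["name", "address", "birth_date"],
}

def reveal(record, evidence_level):
    """Return the view of `record` matching the evidentiary need.

    evidence_level 0 corresponds to normal operation (most anonymous);
    higher levels correspond to stronger scientific/evidentiary support,
    analogous to a probable-cause finding, and unlock more identifiable
    tiers of the data.
    """
    tier = TIERS[min(evidence_level, len(TIERS) - 1)]
    return {f: record[f] for f in FIELDS_BY_TIER[tier] if f in record}

record = {
    "name": "J. Doe", "address": "1 Main St", "birth_date": "1970-05-01",
    "birth_year": "1970", "gender": "F", "zip": "15213",
    "zip3": "152", "age_range": "50-59",
}

print(reveal(record, 0))  # normal operation: only coarse, anonymous fields
print(reveal(record, 2))  # drill-down after sufficient evidence accrues
```

A real system would also need the immutable audit noted in the keywords, so that every drill-down and its justification are recorded tamper-evidently; that bookkeeping is omitted here.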

Keywords: homeland security, privacy-preserving surveillance, immutable audit

L. Sweeney. Privacy-Preserving Surveillance Using Selective Revelation. IEEE Intelligent Systems, September-October 2005. (PDF)
[Earlier version: Carnegie Mellon University, LIDAP Working Paper 15, February 2005. (PDF)]


Related Publications

  • "Privacy Technologies for Homeland Security", Testimony before the Privacy and Integrity Advisory Committee of the Department of Homeland Security ("DHS"), Boston, MA, June 15, 2005. (Testimony and Appendices)

  • L. Sweeney. Privacy-Enhanced Linking. ACM SIGKDD Explorations 7(2) December 2005. (PDF).

  • L. Sweeney. AI Technologies to Defeat Identity Theft Vulnerabilities. AAAI Spring Symposium, AI Technologies for Homeland Security, 2005. (PDF).

  • L. Sweeney and R. Gross. Mining Images in Publicly-Available Cameras for Homeland Security. AAAI Spring Symposium, AI Technologies for Homeland Security, 2005. (PDF).

  • L. Sweeney. Privacy-Preserving Bio-terrorism Surveillance. AAAI Spring Symposium, AI Technologies for Homeland Security, 2005. (Poster).

  • L. Sweeney. Towards a Privacy-Preserving Watchlist Solution. AAAI Spring Symposium, AI Technologies for Homeland Security, 2005. (Poster).

  • E. Newton, L. Sweeney, and B. Malin. Preserving Privacy by De-identifying Facial Images. IEEE Transactions on Knowledge and Data Engineering, February 2005. Earlier version available as: E. Newton, L. Sweeney, and B. Malin. Preserving Privacy by De-identifying Facial Images. Carnegie Mellon University, School of Computer Science, Technical Report CMU-CS-03-119. Pittsburgh: 2003. (26 pages in PDF).

In the News

  • CBS News, Associated Press, March 15, 2004, "Privacy Safeguards Quietly Killed". (text)
  • CBS News, Associated Press, November 4, 2002, "Germ Patrol: Like Never Before". (text)

Copyright © 2011 President and Fellows of Harvard University.