Technology in Government (TIG) | Topics in Privacy (TIP)

Technology in Government (TIG) and Topics in Privacy (TIP) consist of weekly discussions and brainstorming sessions on all aspects of privacy (TIP) and on uses of technology to assess and solve societal, political, and government problems (TIG). Discussions are often inspired by a real-world problem being faced by the lead discussant, who may be from industry, government, or academia. Practice talks and presentations on specific techniques and topics are also common.

Unless otherwise noted, refreshments are served at 2:30pm and discussion runs from 3 to 4pm in room K354, 1737 Cambridge Street, Cambridge. Topics are usually not posted more than a week or two in advance.

Schedule Spring 2014

Date  Discussant  Topic
2/10  Peter Suber, Berkman Center  Brainstorming Open Access with the Director of the Harvard Office for Scholarly Communication
2/24  Arek Stopczynski, Technical University of Denmark  Privacy Implications of Measuring Large-Scale Social Networks With High Resolution (PDF)
3/3   Scott Howe, Acxiom (2:30 start time)  Progress and Privacy: Bitter Enemies or Strong Allies?
3/10  Arnon Rosenthal, MITRE  Supporting Patient Consents: A Computer Science Researcher's Experiences In A Complex World
3/28  Special Seminar at the Harvard Faculty Club, 8:30am - 3:30pm  Governance Issues for Private Data Stores
3/31  Deborah Peel, Patient Privacy Rights (2:30 start time)  How Your Medical Data Is Shockingly Vulnerable to Privacy Invasions
4/7   Ilaria Liccardi, MIT  Investigating privacy issues related to mobile devices. Are current transparency mechanisms working?
4/28  David Abrams, Suffolk Law  Warrantless Searches of Personal Electronic Devices: Is the Baby Lost in the Woods?
5/5   Ben Shiller, Brandeis  First-Degree Price Discrimination Using Big Data
5/12  Adam Tanner, Harvard; John Acres, CEO, Acres 4.0  How Casinos Use Your Personal Data to Keep You Coming Back for More
5/19  Oshani Seneviratne, MIT  Healthcare Privacy and Intellectual Property Rights Protection with Accountable Systems


Abstracts of Talks and Discussions

  1. Brainstorming Open Access with the Director of the Harvard Office for Scholarly Communication

    Peter Suber will lead a brainstorming session about Open Access.

    Peter Suber is a Faculty Fellow at the Berkman Center for Internet and Society as well as the Director of the Harvard Office for Scholarly Communication and Director of the Harvard Open Access Project. His latest book is Open Access (MIT Press, 2012), available in paperback and in at least eight OA editions. His complete bio can be found at https://bit.ly/petersuber.


  2. Privacy Implications of Measuring Large-Scale Social Networks With High Resolution

    Our capability to sense complex social systems with high resolution has been growing at an unprecedented rate. In the Copenhagen Networks Study, data including face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) is collected from a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Due to the extreme richness of the collected data, its high temporal resolution, its longitudinal character, and the size of the population, protecting the privacy of the participants poses some new and exciting challenges. The talk will include an overview of the privacy challenges the authors of the study encountered - from both technical and operational perspectives - as well as the solutions implemented.

    Arkadiusz (Arek) Stopczynski is a PhD student at the Technical University of Denmark and a visiting student in the Human Dynamics group at the MIT Media Lab. Arek's work focuses on building solutions for large-scale sensing of human systems, including behavioral, psychological, and neurological data, while exploring the privacy practices of data handling from business, legal, and technical perspectives. Arek is a co-author of the chapter "The New Deal on Data: A Framework for Institutional Controls" in the upcoming book "Privacy, Big Data, and the Public Good: Frameworks for Engagement".


  3. Progress and Privacy: Bitter Enemies or Strong Allies?

    Progressive marketers cry, "We need data!" Privacy advocates challenge, "Stop the sharing!" But is there another view - where consumer data can be used for compelling marketing programs, without compromising personal privacy? Join Scott Howe, Acxiom CEO and President, as he discusses how progress and privacy can, and should, go hand in hand. You'll learn how Acxiom, and companies like it, gather and use consumer data. He will also give an update on Acxiom's https://www.aboutthedata.com - the first consumer portal for viewing and managing personal marketing data. Finally, he will offer his frank opinions on recent Congressional investigations of data brokers - and the potential for future government regulation.

    As CEO and president of Acxiom, Scott drives a strong, results-oriented culture for Acxiom's approximately 6,200 associates as the company deepens and expands its offerings of global marketing and technology products. Delivering on his vision, Acxiom is fundamentally changing the business of marketing by delivering 1:1 cross-channel marketing at scale.

    Scott is a magna cum laude graduate of Princeton University, with a degree in economics, and he earned an MBA from Harvard University. He serves on the boards of Blue Nile, a leading online retailer of diamonds and fine jewelry, and the Center for Medical Weight Loss. Formerly, he was a director of the Internet Advertising Bureau (IAB) and of Turn, Inc., a digital advertising company.


  4. Supporting Patient Consents: A Computer Science Researcher's Experiences In A Complex World

    To protect patient privacy, and to avoid liability, data holders may disclose patient records only in accordance with patient consent or government rules. One desires granular control over topics (e.g., HIV), data categories (e.g., medications), recipients (e.g., mental health professionals treating me), and also over technical protections (require two-phase authentication?). Current initiatives emphasize compliance with applicable laws, so that data holders will be willing to share; they do little to address each patient's need to balance sharing and protection. For example, we demonstrate that a CS PhD is insufficient education for wisely managing "checkbox" preferences, and show how legal "protections" may actually cause patient preferences to be less appropriate. We sketch a radical alternative that generates detailed policies from a brief description of patient concerns, dropping the assumptions of perfect appropriateness of the policy and perfect enforcement. We then invite discussion of aspects that require legal/regulatory adjustments, and of other data protection arenas where such an approach might be suitable. The opinions expressed are the speaker's own. The work was done in collaboration with Peter Mork, Jean Stanford, Anne Kling, Gail Hamilton, Linda Koontz, Marc Hadley, and others, and was funded by the MITRE Corporation and the U.S. Substance Abuse and Mental Health Services Administration.

    Arnon Rosenthal led MITRE's Kairon Consents project, and consults and publishes in the areas of data sharing, databases, cloud migration, and policy-based systems. He previously worked at Computer Corporation of America and Sperry Research, served on the faculty of the University of Michigan (Ann Arbor), and spent sabbaticals at IBM Almaden Research and ETH Zurich. He holds a PhD (EE-CS, 1975) from the University of California, Berkeley.


  5. Governance Issues for Private Data Stores. Cohosted with the Web Science Trust

    Harvard Faculty Club, March 28th, 8:30am - 3:30pm. Please RSVP by March 24th by email to john AT taysom.com.

    Convener: John Taysom, 2012 Senior Fellow, Advanced Leadership Initiative, and the Web Science Trust. Hosted by the Harvard Topics in Privacy seminar series.

    Objective: To develop a policy recommendation for appropriate governance of private data stores.

    Synopsis: Privacy issues dominated the news in 2013 in the US, UK, and Europe. Governments were seen to have built considerable stores of private data on citizens. Private companies, often without apparently sustainable business models, were seen to be using data collected from users of online services in ways that concerned many. Individuals were shown to be identifiable, even when supposedly anonymised, from Big Data sets compiled from a broad set of data beyond just the legally defined set of 'Personally Identifiable Information'. Online services can be seen to be rebuilding 'walled gardens' around the ability to track users from device to device, and hence to maximise their advertising revenues, often at the expense of independent services on the wider web. None of these are desirable developments, and yet free web content, especially news, is arguably a public good. The problem is arguably a technical one. For more information and registration details, download the overview here. (PDF)


  6. How Your Medical Data Is Shockingly Vulnerable to Privacy Invasions

    A Freudian psychoanalyst for the past 35 years, Deborah Peel has learned that a patient's deepest secrets – his or her medical data – are not locked up in a deep vault safe from prying eyes. Rather, she says, such information is spread across many layers of providers and insurers, leaving it especially weak and potentially vulnerable in the ever-expanding world of personal data in the Internet age. Worried about the risk of discrimination, she founded Patient Privacy Rights in 2004 and has become the nation's leading voice on the issue.

    Dr. Peel founded Patient Privacy Rights (PPR), the world's leading consumer health privacy advocacy organization, with 20,000 members in all 50 states. She leads the bipartisan Coalition for Patient Privacy, representing 10.3 million Americans who want to control personal health data in electronic systems. She led the development of PPR's Trust Framework, a set of 75+ auditable criteria for measuring whether technology protects privacy. She also created the influential International Summits on the Future of Health Privacy. Dr. Peel was named one of the "100 Most Influential in Healthcare" by Modern Healthcare in 2007, 2008, 2009, and 2011, the only privacy expert ever listed.


  7. Investigating privacy issues related to mobile devices. Are current transparency mechanisms working?

    Many smartphone apps collect personal information for a variety of purposes - such as advertising for revenue, where personal information is used to provide relevant targeting and to discover market trends. Our personal information, habits, likes, and dislikes can all be deduced from our mobile devices. Recent news reports have documented how the NSA has used leaky apps to spy on users' profiles. Safeguarding mobile privacy is therefore of great concern. In order to understand the dynamics of information collection in mobile apps and to demonstrate the value of transparent access to the details of their access permissions, we gathered information about a large percentage of the apps on Google Play and analyzed the permissions requested by each app. We developed a quantitative measure of the risk posed by apps, devising a sensitivity score that counts the occurrences of permissions that read personal information about users in apps where network communication is also possible. We found that 46% of apps in our collected dataset have varying levels of access to users' personal data, and only 6.6% have declared a "privacy policy" within the app page.

    Users are often unaware of this kind of access even though they grant the required permissions upon app installation. We identify three possible reasons why users have problems choosing applications that are less likely to disclose their personal information. We have developed a new interface that, when a user is selecting an app, presents the permissions that grant access to personal data. Using the sensitivity score, a quantitative measure of an app's ability to disclose personal information, our interface highlights relevant permissions and focuses users on the permissions that matter. We validated the effectiveness of this approach with a study of 125 Android smartphone users and found that our improved permission interface led participants to choose apps with less access to their personal data.
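
    The abstract describes the sensitivity score only at a high level. As a purely illustrative sketch (not the authors' actual implementation), a score of this kind might count an app's personal-data permissions only when the app can also communicate over the network; the particular permission lists below are assumptions chosen for the example.

        # Illustrative only: a toy "sensitivity score" in the spirit described
        # above, not the scoring used in the study. Which permissions count as
        # "personal" is an assumption made for this sketch.

        PERSONAL_PERMISSIONS = {
            "android.permission.READ_CONTACTS",
            "android.permission.ACCESS_FINE_LOCATION",
            "android.permission.READ_SMS",
            "android.permission.READ_CALL_LOG",
            "android.permission.GET_ACCOUNTS",
        }
        NETWORK_PERMISSIONS = {
            "android.permission.INTERNET",
            "android.permission.ACCESS_NETWORK_STATE",
        }

        def sensitivity_score(requested_permissions):
            """Count personal-data permissions, but only if the app can also
            reach the network (otherwise the data cannot easily leave the device)."""
            requested = set(requested_permissions)
            if not requested & NETWORK_PERMISSIONS:
                return 0
            return len(requested & PERSONAL_PERMISSIONS)

        # Example: an app requesting internet access, fine location, and the
        # contact list scores 2, flagging it for attention in the interface.
        print(sensitivity_score([
            "android.permission.INTERNET",
            "android.permission.ACCESS_FINE_LOCATION",
            "android.permission.READ_CONTACTS",
        ]))  # -> 2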

    Ilaria Liccardi is a Marie Curie Postdoctoral Fellow working in the Decentralized Information Group at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) with Prof. Hal Abelson, Daniel J. Weitzner, and Joseph Pato. She investigates how users understand and perceive transparency mechanisms in mobile and web applications. She designs tools and techniques to help users be better aware of the possible dangers associated with access to and sharing of their personal information. She believes that helping users make clear and informed choices will help them to value their privacy and to choose apps or sites that are less likely to access and/or share their personal information.



  8. Warrantless Searches of Personal Electronic Devices: Is the Baby Lost in the Woods?

    Technology moves forward in leaps and bounds, but the law advances in baby steps. Long-standing legal precedent allows a police officer to search a suspect without a warrant after an arrest to ensure the officer's safety and prevent the destruction of evidence. Over time, courts have incrementally expanded this search to allow opening a cigarette pack, looking through a paper address book, or examining the contents of a beeper. Recently, police have begun examining the call history and address lists from suspects' cell phones. With modern smartphones holding tens of gigabytes of data, extension of this policy allows police to view email, text messages, documents, and images accumulated over months or years: personal information that traditionally would be protected from casual search by the Fourth Amendment's warrant requirement. Has technology created an end-run around the Constitution? The Supreme Court of the United States has agreed to decide the issue this summer. How will it rule? I discuss the history of the search-incident-to-arrest exception to the warrant requirement and examine how technological advances clash with the slow, incremental application of precedent to the law.

    David Abrams teaches introductory circuit design at Harvard and Problem Solving and Internet Law at Suffolk Law School. He is interested in the intersection of law and technology, as well as privacy and copyright issues on the Internet. David graduated from M.I.T. with degrees in Electrical Engineering and earned his J.D. at Harvard Law School.


  9. First-Degree Price Discrimination Using Big Data

    Person-specific pricing has, until recently, rarely been observed. The reason - that reservation values were unobtainable - may no longer hold now that massive datasets tracking detailed individual behavior exist. Hence, a fundamental change in the way goods are priced may be underway. I first explain and discuss the potentially massive changes that are underway, and then investigate this claim in one context. I show that demographics, which in the past could be used to personalize prices, poorly predict which consumers subscribe to Netflix. By contrast, modern web-browsing data, with variables such as visits to Amazon.com and internet use on Tuesdays - variables which reflect behavior - do substantially better. I then present a model to estimate demand and simulate outcomes had personalized pricing been implemented. Simulations show that using demographics alone to tailor prices raises profits by 0.8%. Including nearly 5,000 potential website-browsing explanatory variables increases profits by much more, 12.2%, increasing the appeal of tailored pricing and resulting in some consumers paying double the price others do for the exact same product.
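
    As a minimal sketch of the mechanism under discussion (not Shiller's actual demand model), first-degree price discrimination amounts to predicting each consumer's willingness to pay from observed behavior and charging each consumer a price at or just below that prediction. The numbers below are made up for illustration and assume zero marginal cost.

        # Illustrative only: compares the best single uniform price with
        # personalized prices based on (hypothetical) predicted willingness
        # to pay (WTP). The values are not from the talk or the paper.

        # Predicted monthly WTP, in dollars, for five consumers - e.g. the
        # output of a model trained on web-browsing behavior.
        predicted_wtp = [6.0, 8.0, 9.0, 12.0, 16.0]

        def profit_uniform(price, wtp):
            """Everyone faces the same price; each consumer buys only if the price is at or below their WTP."""
            return sum(price for w in wtp if w >= price)

        def profit_personalized(wtp):
            """Each consumer is charged exactly their own predicted WTP."""
            return sum(wtp)

        best_uniform = max(profit_uniform(p, predicted_wtp) for p in predicted_wtp)
        print(best_uniform)                         # 32.0, at a uniform price of 8
        print(profit_personalized(predicted_wtp))   # 51.0 when every price is personalized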

    Benjamin Shiller is an Assistant Professor of Economics at Brandeis University. Previously, he was a visiting fellow at the National Bureau of Economic Research, as part of the Economics of Digitization and Copyright Initiative. His research has focused on the economic impact of digitization and the internet. Specifically, he has analyzed the impact on optimal pricing, supplier coordination, and resale. His research has been featured in the press, in notable publications such as The Economist, Forbes Magazine, The Washington Post, The American Prospect, and VOX EU.


  10. How Casinos Use Your Personal Data to Keep You Coming Back for More

    Starting in the late 1990s, a former Harvard Business School professor became the driving force behind the casino company Caesars, making it a widely admired engine of data collection. Boosted by vast banks of computers, Caesars today knows the names of the vast majority of its clients, exactly what they spend, where they like to spend it, how often they come, and many other characteristics. How does it gather this data, and how does it use it?

    The first half of the talk previews findings from Adam Tanner's upcoming book "What Stays in Vegas: The World of Personal Data – Lifeblood of Big Business – and the End of Privacy as We Know It." In the second half, legendary slot machine developer John Acres will outline how the casino games of the future will need to incorporate ever more personal information to make them compelling to people who grew up in the Internet era.

    Adam Tanner is a fellow at Harvard's Institute for Quantitative Social Science. Details about his book are at WhatStaysinVegas.US

    John Acres is a Las Vegas inventor and entrepreneur who has transformed modern-day casino games. He created the first system that allowed casinos to track gamblers on slot machines, devised the modern progressive jackpots in which prizes rise over time, and developed a system of instant bonuses - all features that are staples of modern games. He is CEO of Acres 4.0, which is developing the next generation of casino games.


  11. Healthcare Privacy and Intellectual Property Rights Protection with Accountable Systems

    I will describe an infrastructure that enables transparency and accountability around the appropriate usage of data. This infrastructure is encapsulated in a web protocol called HTTPA (HTTP with Accountability) and powered by the decentralized Provenance Tracking Network (PTN). HTTPA enables data consumers and data producers to agree to specific usage restrictions, while the PTN is used to preserve the provenance of data transferred from one entity to another. Using this infrastructure, the data subject can derive an audit trail for a data item and determine whether there have been any usage restriction violations. We have evaluated the protocol with two reference implementations of accountable systems: (1) Transparent Health, an electronic healthcare records system that lets patients mark data items as sensitive and determine the access and usage of those data items, and (2) PhotoRM, a decentralized photo sharing and editing application that allows a content creator to see how her content has been reused on the Web.
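
    The abstract describes HTTPA and the PTN only in terms of their goals. Purely as an illustrative sketch of the underlying idea - agreed usage restrictions plus an append-only provenance log that a data subject can audit - and not the actual HTTPA wire protocol or PTN implementation, the toy model below records transfers and flags usages outside the agreed restrictions; all names in it are hypothetical.

        # Illustrative only: a toy model of "usage restrictions + provenance log",
        # not the real HTTPA protocol or Provenance Tracking Network.
        from dataclasses import dataclass, field
        from typing import List, Tuple

        @dataclass
        class TransferRecord:
            resource: str                  # identifier of the data item
            sender: str
            receiver: str
            restrictions: Tuple[str, ...]  # usage restrictions the receiver agreed to
            usage: List[str] = field(default_factory=list)  # how it was later used

        class ProvenanceLog:
            """Append-only log standing in for the decentralized PTN."""
            def __init__(self):
                self.records: List[TransferRecord] = []

            def record_transfer(self, resource, sender, receiver, restrictions):
                rec = TransferRecord(resource, sender, receiver, tuple(restrictions))
                self.records.append(rec)
                return rec

            def audit(self, resource):
                """Derive an audit trail: list usages outside the agreed restrictions."""
                violations = []
                for rec in self.records:
                    if rec.resource != resource:
                        continue
                    violations += [(rec.receiver, u) for u in rec.usage
                                   if u not in rec.restrictions]
                return violations

        # Example: a lab result shared for "treatment" only is later used for
        # "marketing"; the data subject's audit surfaces the violation.
        log = ProvenanceLog()
        rec = log.record_transfer("lab-result-42", "hospital", "analytics-co",
                                  restrictions=["treatment"])
        rec.usage.append("marketing")
        print(log.audit("lab-result-42"))  # [('analytics-co', 'marketing')]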

    Oshani Seneviratne is a PhD candidate at MIT CSAIL, advised by Tim Berners-Lee. Her research is on social systems on the Web augmented with provenance, policy expressions, and Linked Data. She is also working on using MIT App Inventor to build disaster management applications. For more information, please see: https://people.csail.mit.edu/oshani/




Prior Sessions

Fall 2013 | Spring 2013 | Fall 2012 | Spring 2012 | Fall 2011



Copyright © 2012-2014, President and Fellows of Harvard University.   |   IQSS   |   Data Privacy Lab