Wednesday, May 23, 2012

Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study

Chin, Andrew and Anne Klinefelter. "Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study" (May 8, 2012). North Carolina Law Review, Vol. 90, No. 5, 2012; UNC Legal Studies Research Paper.

From the abstract: "Recent computer science research on the reidentification of individuals from anonymized data has given some observers in the legal community the impression that the utilization of data is incompatible with strong privacy guarantees, leaving few options for balancing privacy and utility in various data-intensive settings. This bleak assessment is incomplete and somewhat misleading, however, because it fails to recognize the promise of technologies that support anonymity under a standard that computer scientists call differential privacy. This standard is met by a database system that behaves similarly whether or not any particular individual is represented in the database, effectively producing anonymity. Although a number of computer scientists agree that these technologies can offer privacy-protecting advantages over traditional approaches such as redaction of personally identifiable information from shared data, the legal community’s critique has focused on the burden that these technologies place on the utility of the data. Empirical evidence, however, suggests that at least one highly successful business, Facebook, has implemented such privacy-preserving technologies in support of anonymity promises while also meeting commercial demands for utility of certain shared data.

This Article uses a reverse-engineering approach to infer that Facebook appears to be using differential privacy-supporting technologies in its interactive query system to report audience reach data to prospective users of its targeted advertising system, without apparent loss of utility."
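To make the abstract's definition concrete: the standard is met when a query system's output distribution changes very little whether or not any one individual's record is in the database. The classic way to achieve this for counting queries (such as an advertiser's "audience reach" estimate) is the Laplace mechanism, which adds noise calibrated to the query's sensitivity. The sketch below is illustrative only; the article does not disclose Facebook's actual mechanism, and all function names here are hypothetical.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) by inverting the CDF:
    # X = -scale * sgn(u) * ln(1 - 2|u|) for u uniform on (-0.5, 0.5)
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon=0.1):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon suffices.  Smaller epsilon means stronger privacy
    and noisier answers.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Because the noise is independent of the data, an observer seeing the noisy count cannot confidently tell apart two databases that differ in a single record, yet repeated queries still average out to a useful estimate of the true audience size.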