Privacy has become a significant concern in modern society as personal information about individuals is increasingly collected, used, and shared, often through digital technologies, by a wide range of organizations. One goal of this project is to articulate precisely what privacy means in various settings, and whether and how it can be achieved. In other words, we seek to develop conceptual and technical frameworks in which privacy notions (policies) are given precise semantics, together with algorithms for enforcing such policies and characterizations of classes of policies that can or cannot be enforced. In addition to general results of this form, another goal of the project is to study specific application domains that raise significant privacy concerns and to apply these results (or specialized versions thereof) to those domains. Our current focus is on the healthcare domain. We are also studying privacy issues on the Web, in online social media, and in court records.
Specifically, to mitigate privacy concerns, organizations are required to respect privacy laws in regulated sectors (e.g., HIPAA in healthcare, GLBA in the financial sector) and to adhere to self-declared privacy policies in self-regulated sectors (e.g., the privacy policies of companies such as Google and Facebook in Web services). We investigate the possibility of formalizing and enforcing such practical privacy policies using computational techniques. We formalize privacy policies that prescribe and proscribe *flows* of personal information, as well as those that place restrictions on the *purposes* for which a governed entity may use personal information. Recognizing that traditional preventive access control and information flow control mechanisms are inadequate for enforcing such privacy policies, we develop principled audit and accountability mechanisms with provable properties that seek to encourage policy-compliant behavior by detecting policy violations, assigning blame, and punishing violators. We apply these techniques to several US privacy laws and organizational privacy policies, in particular producing the first complete logical specification and audit of all disclosure-related clauses of the HIPAA Privacy Rule.
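To make the flavor of this approach concrete, the following is a minimal sketch (not the project's actual tooling, which uses logical specifications and handles incomplete logs) of auditing a disclosure log against a toy flow-and-purpose restriction in the spirit of the HIPAA Privacy Rule: protected health information (PHI) may flow to an outside party only for an allowed purpose or with patient authorization. All field names and the policy itself are hypothetical illustrations.

```python
# Hypothetical set of purposes for which PHI disclosure is permitted
# without explicit patient authorization (illustrative, not the HIPAA list).
ALLOWED_PURPOSES = {"treatment", "payment", "healthcare-operations"}

def audit(log):
    """Return the log entries that violate the toy policy."""
    violations = []
    for entry in log:
        if entry["info_type"] != "phi":
            continue  # the policy only governs flows of PHI
        permitted = entry["purpose"] in ALLOWED_PURPOSES or entry["authorized"]
        if not permitted:
            violations.append(entry)
    return violations

# A toy disclosure log: each entry records who received what, and why.
log = [
    {"to": "insurer",  "info_type": "phi", "purpose": "payment",   "authorized": False},
    {"to": "marketer", "info_type": "phi", "purpose": "marketing", "authorized": False},
    {"to": "marketer", "info_type": "phi", "purpose": "marketing", "authorized": True},
]

print([e["to"] for e in audit(log)])  # only the unauthorized marketing disclosure
```

A real audit mechanism of the kind developed in this project must additionally cope with logs that are incomplete (some facts unknown at audit time) and with subjective clauses such as professional judgment, which is what motivates the iterative, oracle-assisted algorithms in the papers below.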
An overview paper:
o A. Datta, Privacy through Accountability: A Computer Science Perspective, in Proceedings of 10th International Conference on Distributed Computing and Internet Technology, February 2014. [Paper] Invited Paper
o M. C. Tschantz, A. Datta, J. M. Wing, Formalizing and Enforcing Purpose Restrictions in Privacy Policies, in Proceedings of 33rd IEEE Symposium on Security and Privacy, May 2012. [Paper] [Full Version]
o A. Barth, A. Datta, J. C. Mitchell, H. Nissenbaum, Privacy and Contextual Integrity: Framework and Applications, in Proceedings of 27th IEEE Symposium on Security and Privacy, pp. 184-198, May 2006. [Paper]
o H. DeYoung, D. Garg, L. Jia, D. Kaynar, A. Datta, Experiences in the Logical Specification of the HIPAA and GLBA Privacy Laws, in Proceedings of 9th ACM Workshop on Privacy in the Electronic Society, October 2010. [Paper] [Full Version]
Audit and Accountability
o J. Blocki, N. Christin, A. Datta, A. Sinha, Adaptive Regret Minimization in Bounded-Memory Games, in Proceedings of 4th Conference on Decision and Game Theory for Security, November 2013. [Full Version]
o J. Blocki, N. Christin, A. Datta, A. Sinha, Audit Mechanisms for Provable Risk Management and Accountable Data Governance, in Proceedings of 3rd Conference on Decision and Game Theory for Security, November 2012. [Paper]
o D. Garg, L. Jia, A. Datta, Policy Auditing over Incomplete Logs: Theory, Implementation and Applications, in Proceedings of 18th ACM Conference on Computer and Communications Security, October 2011. [Paper] [Full Version]
o J. Blocki, N. Christin, A. Datta, A. Sinha, Regret Minimizing Audits: A Learning-Theoretic Basis for Privacy Protection, in Proceedings of 24th IEEE Computer Security Foundations Symposium, June 2011. [Paper]
o J. Blocki, A. Blum, A. Datta, O. Sheffet, Differentially Private Data Analysis of Social Networks via Restricted Sensitivity, in Proceedings of 4th Innovations in Theoretical Computer Science Conference, January 2013. [Full Version]
o J. Blocki, A. Blum, A. Datta, O. Sheffet, The Johnson-Lindenstrauss Transform Itself Preserves Differential Privacy, in Proceedings of 53rd Annual IEEE Symposium on Foundations of Computer Science, October 2012. [Full Version]
o M. C. Tschantz, D. Kaynar, A. Datta, Formal Verification of Differential Privacy for Interactive Systems, Extended abstract in Proceedings of the 27th Annual Conference on Mathematical Foundations of Programming Semantics, May 2011. [Full Version]
o O. Chowdhury, A. Gampe, J. Niu, J. von Ronne, J. Bennatt, A. Datta, L. Jia, W. H. Winsborough, Privacy Promises That Can Be Kept: A Policy Analysis Method with Application to the HIPAA Privacy Rule, in Proceedings of 18th ACM Symposium on Access Control Models and Technologies, June 2013.
o A. Conley, A. Datta, H. Nissenbaum, D. Sharma, Sustaining both Privacy and Open Justice in the Transition from Local to Online Access to Court Records: A Multidisciplinary Inquiry, Maryland Law Review, 71 Md. L. Rev. 772 (2012). [Paper]
(Preliminary version presented at the 2011 Privacy Law Scholars Conference, June 2011.)
o A. Datta, J. Blocki, N. Christin, H. DeYoung, D. Garg, L. Jia, D. Kaynar, A. Sinha, Understanding and Protecting Privacy: Formal Semantics and Principled Audit Mechanisms, in Proceedings of 7th International Conference on Information Systems Security, December 2011. [Paper] Invited Paper
o J. Blocki, N. Christin, A. Datta, A. Sinha, Audit Mechanisms for Privacy Protection in Healthcare Environments (Position Paper), in 2nd USENIX Workshop on Health Security and Privacy, August 2011. [Paper]
o A. Datta, N. Dave, J. C. Mitchell, H. Nissenbaum, D. Sharma, Privacy Challenges in Patient-Centric Health Information Systems (Position Paper), in 1st USENIX Workshop on Health Security and Privacy, August 2010. [Paper]