Authors:
Georgia Tsaloli and Aikaterini Mitrokotsa
Affiliation:
Department of Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden
Keyword(s):
Verifiable Computation, Differential Privacy, Privacy-preservation.
Related Ontology Subjects/Areas/Topics:
Applied Cryptography; Cryptographic Techniques and Key Management; Data and Application Security and Privacy; Data Engineering; Data Integrity; Databases and Data Security; Information and Systems Security; Privacy; Security and Privacy in the Cloud
Abstract:
Service providers often need to outsource computations on sensitive datasets and subsequently publish statistical results over a population of users. In this setting, service providers want guarantees about the correctness of the computations, while individuals want guarantees that their sensitive information will remain private. Encryption alone is not sufficient to prevent information leakage, since querying a database about individuals or requesting summary statistics can still reveal sensitive information. Differential privacy addresses the paradox of learning nothing about an individual while learning useful information about a population. Verifiable computation addresses the challenge of proving the correctness of computations. Although verifiable computation and differential privacy are important tools in this context, their interconnection has received limited attention. In this paper, we address the following question: How can we design a protocol that provides both differential privacy and verifiable computation guarantees for outsourced computations? We formally define the notion of verifiable differentially private computation (VDPC) and identify the minimal requirements needed to achieve it. Furthermore, we propose a protocol that provides verifiable differentially private computation guarantees and discuss its security and privacy properties.
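As a rough illustration of the differential privacy guarantee described above, the sketch below applies the standard Laplace mechanism to a counting query. This is a generic textbook example, not the paper's VDPC protocol; the function names, the toy dataset, and the epsilon value are illustrative assumptions.

    import random

    def laplace_noise(scale):
        # The difference of two i.i.d. exponential samples with mean `scale`
        # follows a Laplace(0, scale) distribution.
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    def dp_count(records, predicate, epsilon):
        # A counting query has sensitivity 1: adding or removing one individual
        # changes the count by at most 1, so adding Laplace(1/epsilon) noise
        # yields epsilon-differential privacy for this single release.
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace_noise(1.0 / epsilon)

    # Example: a noisy count over a hypothetical dataset of ages.
    ages = [34, 29, 41, 56, 23, 38]
    print(dp_count(ages, lambda age: age > 30, epsilon=0.5))

The noisy count remains useful as a population-level statistic, while any single individual's presence or absence changes the output distribution by at most a factor of e^epsilon.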