Schabhüser, Lucas (2019)
Information-Theoretic Privacy in Verifiable Outsourced Computation.
Technische Universität Darmstadt
Dissertation, first publication
Abstract
Today, it is common practice to outsource time-consuming computations to the cloud. Using the cloud allows anyone to process large quantities of data without having to invest in the necessary hardware, significantly lowering costs. In this thesis we consider the following workflow for outsourced computations: a data owner uploads data to a server. The server then computes some function on the data and sends the result to a third entity, which we call the verifier.
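The sketch below makes the three roles of this workflow explicit. It is a minimal illustration only; the proof handling is a placeholder assumption and not one of the constructions from the thesis.

```python
# Minimal sketch of the outsourcing workflow: data owner -> server -> verifier.
# The "proof" here is a placeholder for the cryptographic evidence a real
# verifiable-computation scheme would produce.

from typing import Callable, List, Tuple

class DataOwner:
    """Holds the input data and hands it to the server."""
    def __init__(self, data: List[int]):
        self.data = data

    def upload(self) -> List[int]:
        # In the real setting the data would be authenticated (and possibly
        # hidden) before leaving the owner's machine.
        return list(self.data)

class Server:
    """Evaluates a function on the outsourced data and returns result plus proof."""
    def __init__(self, data: List[int]):
        self.data = data

    def evaluate(self, f: Callable[[List[int]], int]) -> Tuple[int, str]:
        result = f(self.data)
        proof = "placeholder-proof"  # stands in for the cryptographic evidence
        return result, proof

class Verifier:
    """Checks the server's claim; ideally much cheaper than recomputing f."""
    def check(self, result: int, proof: str) -> bool:
        return proof == "placeholder-proof"

owner = DataOwner([3, 1, 4, 1, 5])
server = Server(owner.upload())
result, proof = server.evaluate(sum)
assert Verifier().check(result, proof) and result == 14
```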
In this scenario, two fundamental security challenges arise. A malicious server may not perform the computation correctly, leading to an incorrect result. Verifiability allows for the detection of such results. For this to be practical, the verification procedure needs to be efficient. The other major challenge is privacy. If sensitive data, for example medical data, is processed, it is important to prevent unauthorized access to it. Particularly sensitive data has to be kept confidential even in the long term.
The field of verifiable computing provides solutions to the first challenge: the verifier can check that the returned result was computed correctly. Simultaneously addressing privacy, however, leads to new challenges. In the scenario of outsourced computation, privacy comes in different flavors. One is privacy with respect to the server, where the goal is to prevent the server from learning anything about the data being processed. The other is privacy with respect to the verifier. Without verifiable computation, the verifier obviously has less information about the original data than the data owner: it only knows the output of the computation, not its input. If, however, this third-party verifier is given additional cryptographic data to verify the result of the computation, it might use this additional information to learn something about the inputs. To prevent this, a different privacy property is required, which we call privacy with respect to the verifier. Finally, particularly sensitive data has to be kept confidential even in the long term, when computational privacy is no longer sufficient. Thus, information-theoretic measures are required. These measures offer protection even against computationally unbounded adversaries.
Two well-known approaches to these challenges are homomorphic commitments and homomorphic authenticators. Homomorphic commitments can even provide information-theoretic privacy, thus addressing long-term security, but their verification is computationally expensive. Homomorphic authenticators, on the other hand, offer efficient verification but do not provide information-theoretic privacy.
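To make the contrast concrete, the toy example below shows a Pedersen-style commitment, a standard textbook instance of an additively homomorphic commitment that is perfectly (information-theoretically) hiding. It is an illustration of the primitive, not a scheme from the thesis, and the parameters are deliberately tiny and insecure.

```python
# Toy Pedersen-style commitment over a prime-order subgroup, illustrating why
# homomorphic commitments can be information-theoretically hiding.
# All parameters are chosen for readability, not security.

import random

# q = 1019 is prime and p = 2*q + 1 = 2039 is also prime, so the squares mod p
# form a subgroup of prime order q.
p, q = 2039, 1019
g = 4   # 4 = 2^2 is a square mod p, hence a generator of the order-q subgroup
h = 9   # 9 = 3^2 likewise; in practice h must be set up so log_g(h) is unknown

def commit(m: int, r: int) -> int:
    """Com(m; r) = g^m * h^r mod p."""
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

# Additive homomorphism: Com(m1; r1) * Com(m2; r2) = Com(m1 + m2; r1 + r2)
m1, r1 = 123, random.randrange(q)
m2, r2 = 456, random.randrange(q)
assert (commit(m1, r1) * commit(m2, r2)) % p == commit(m1 + m2, r1 + r2)

# Perfect hiding: for any commitment c and any candidate message m there is
# exactly one randomness r with Com(m; r) = c, so c reveals nothing about m
# even to an unbounded adversary. Binding, in contrast, is only computational.
```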
This thesis provides solutions to these research challenges: efficient verifiability, input-output privacy, and in particular information-theoretic privacy.
We introduce a new classification of privacy properties in verifiable computing. We propose function-dependent commitments, a novel framework that combines the advantages of homomorphic commitments and homomorphic authenticators with respect to verifiability and privacy. We present several novel homomorphic signature schemes that achieve verifiability and already address privacy with respect to the verifier. In particular, we construct one such scheme tailored to multivariate polynomials of degree two and another tailored to linear functions over multi-sourced data. The latter provides efficient verifiability even for computations over data authenticated under different cryptographic keys. Furthermore, we provide transformations for homomorphic signatures that add privacy: we first show how to add computational privacy and then even information-theoretic privacy. In this way, we turn homomorphic signatures into function-dependent commitments. By applying these transformations to our homomorphic signature schemes, we construct verifiable computing schemes with information-theoretic privacy.
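As a point of reference for the constructions listed above, the sketch below gives the generic interface of a homomorphic signature scheme as it is commonly defined in the literature (key generation, signing of labelled messages, public evaluation, verification). The method names and types are illustrative assumptions, not the thesis' concrete schemes.

```python
# Generic homomorphic-signature interface (key generation, signing, public
# evaluation, verification). Illustrative sketch only; names and types are
# assumptions, not the concrete schemes constructed in the thesis.

from abc import ABC, abstractmethod
from typing import Any, Callable, List, Sequence, Tuple

class HomomorphicSignatureScheme(ABC):
    @abstractmethod
    def keygen(self) -> Tuple[Any, Any]:
        """Return (secret_key, public_key)."""

    @abstractmethod
    def sign(self, secret_key: Any, label: str, message: Any) -> Any:
        """Authenticate `message` under `label` (e.g. a dataset or position identifier)."""

    @abstractmethod
    def evaluate(self, f: Callable[..., Any], signatures: Sequence[Any]) -> Any:
        """Publicly derive a signature on f(m_1, ..., m_n) from signatures on the m_i."""

    @abstractmethod
    def verify(self, public_key: Any, f: Callable[..., Any], labels: List[str],
               result: Any, signature: Any) -> bool:
        """Accept iff `result` is f applied to the messages signed under `labels`."""
```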
| Item type | Dissertation |
|---|---|
| Published | 2019 |
| Author(s) | Schabhüser, Lucas |
| Type of entry | First publication |
| Title | Information-Theoretic Privacy in Verifiable Outsourced Computation |
| Language | English |
| Referees | Buchmann, Prof. Dr. Johannes ; Ryan, Prof. Dr. Peter |
| Year of publication | 2019 |
| Place of publication | Darmstadt |
| Date of oral examination | 24 April 2019 |
| URL / URN | https://tuprints.ulb.tu-darmstadt.de/9052 |
| Free keywords | Solutions; S6 |
| URN | urn:nbn:de:tuda-tuprints-90529 |
| Dewey Decimal Classification (DDC) | 000 General, computer science, information science > 004 Computer science |
| Divisions | 20 Department of Computer Science; 20 Department of Computer Science > Theoretical Computer Science - Cryptography and Computer Algebra |
| Date deposited | 29 Sep 2019 19:56 |
| Last modified | 02 Apr 2020 13:12 |