Homomorphic cryptography is used when computations are delegated to an untrusted third party.
However, there is a discrepancy between the third party's untrustworthiness and the implicit assumption that it will nevertheless perform the expected computations on the encrypted data.
This may raise serious privacy concerns, for example when homomorphic cryptography is used to outsource resource-intensive computations on personal data (e.g., from an IoT device to the cloud).
In this paper, we show how to verify cost-effectively that the delegated computation corresponds to the expected sequence of operations, thus drastically reducing the required level of trust in the third party.
Our approach is based on the well-known modular extension scheme:
it is transparent to the third party,
it is not tied to any particular homomorphic cryptosystem,
and it does not depend on newly introduced (and thus less-studied) cryptographic constructions.
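To make the underlying idea concrete, the following is a minimal, purely illustrative sketch of modular extension over plaintexts (the moduli, the circuit `f`, and all names are our own assumptions, not the paper's implementation): the computation intended modulo N is carried out modulo N·r for a small secret modulus r, and the returned result is checked modulo r against a cheap local recomputation.

```python
import random

N = (1 << 61) - 1   # "real" working modulus (illustrative choice: a Mersenne prime)
r = 1009            # small secret verification modulus, coprime to N (assumption)

def f(x, y, mod):
    """The delegated computation: an arbitrary arithmetic circuit mod `mod`."""
    return (x * x + 3 * y + 7) % mod

def delegate(x, y):
    """Untrusted party: runs the same circuit over the extended ring Z_{N*r}."""
    return f(x, y, N * r)

def verify(x, y, result):
    """Client: cheap consistency check of the returned result modulo small r."""
    return result % r == f(x % r, y % r, r)

x, y = random.randrange(N), random.randrange(N)
res = delegate(x, y)
assert verify(x, y, res)                        # honest computation passes
assert res % N == f(x, y, N)                    # and reduces to the true answer mod N
assert not verify(x, y, (res + 1) % (N * r))    # a tampered result is caught
```

Since reduction modulo r commutes with the ring operations, an honest result always passes the check, while a deviating computation is detected with high probability because r is kept secret from the third party.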
We provide a proof-of-concept implementation,
THC (for "trustable homomorphic computation"),
which we use to perform security and performance analyses.
We then demonstrate its practical usability on a toy electronic voting system.