Delving Deeper into Secure Computation

Handling secure computation while respecting data owner privacy is considerably trickier than it might first appear. The main reason is that there are a number of stakeholders with different needs. Let’s lay them out. First, there’s the data owner (or owners), whose data is to be computed on. Then there’s the secure computation purchaser, who has paid for some computation to be performed on that data. And finally there’s the compute node, which will perform the actual computation itself.

The data owner would like to keep their data private. Ideally, they would prefer that the compute node never sees the raw data. This calls for a technology such as fully homomorphic encryption, which allows computation to be performed directly on encrypted data. In addition, the data owner would like to limit how much information the purchaser receives from their purchased computation, so as to preserve privacy. This calls for a technique such as differential privacy, which adds noise to data points before computation to prevent unnecessary information about the raw data from leaking out.
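To make the differential-privacy ingredient concrete, here’s a minimal local-DP sketch in Python: the data owner clips and noises each point before it ever reaches the compute node. The function names and the epsilon/clipping parameters are illustrative, not from any particular library:

```python
import random

def laplace_noise(scale: float) -> float:
    # Laplace(0, scale), sampled as the difference of two exponentials.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def privatize(values, epsilon, lower, upper):
    """Locally privatize each data point before it leaves the data owner.

    Each value is clipped to [lower, upper] (sensitivity upper - lower) and
    perturbed with Laplace noise of scale sensitivity / epsilon, so each
    released point is epsilon-differentially private on its own.
    """
    scale = (upper - lower) / epsilon
    return [min(max(v, lower), upper) + laplace_noise(scale) for v in values]

# The compute node only ever sees the noisy points. With just 7 points and
# epsilon = 1 the noise dominates; accuracy improves with more data or a
# larger (weaker-privacy) epsilon.
random.seed(0)
ages = [34, 29, 41, 52, 38, 27, 45]
noisy = privatize(ages, epsilon=1.0, lower=0, upper=100)
print(sum(noisy) / len(noisy))
```

Local DP of this sort is the strongest fit for the forum’s framing (noise added “before compute”); noising the aggregate result instead gives better accuracy for the same privacy budget.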

The compute node doesn’t have much at stake here. It can be nosy, so it might store a copy of any data it happens to get access to. Worse, it can be malicious, so it may try to send nonsensical results to the purchaser if it can get away with it. To prevent this, a technology such as STARKs (Scalable Transparent ARguments of Knowledge) is required, so that the purchaser has proof that the correct computation was performed on the correct data.

These needs together suggest a scheme for truly effective secure computation. This scheme must allow for computation on encrypted data, the use of privacy techniques, and the ability to trustlessly verify computation. To put this into a crude equation, effective secure computation might require the following ingredients:

SECURE_COMPUTATION = STARKS + HOMOMORPHIC_ENCRYPTION + DIFFERENTIAL_PRIVACY

Note that homomorphic encryption alone isn’t sufficient: it provides the data owner no privacy from the purchaser. STARKs alone aren’t sufficient, since the compute node could steal the raw data from the data owner. And differential privacy alone isn’t sufficient (how do you know the compute node didn’t just make up its answers?). However, if an effective scheme can be devised that combines these three technologies, it might be possible to enable more effective secure computation.
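To give a feel for the homomorphic-encryption ingredient on its own, here’s a deliberately toy additively homomorphic cipher: a one-time-pad-style random mask, nowhere near real FHE in security model or generality (it only supports addition, and the data owner must keep the keys). The point is just that the compute node can sum ciphertexts without ever seeing a plaintext:

```python
import random

# Toy additively homomorphic cipher -- a pedagogical sketch, NOT real FHE.
# Each plaintext m is masked with a fresh random key: c = (m + k) mod M.
# Ciphertexts add componentwise, and the key holder subtracts the summed
# keys, so the compute node can total encrypted values blindly.

M = 2**61 - 1  # modulus large enough that plaintext sums never wrap

def encrypt(m: int) -> tuple[int, int]:
    k = random.randrange(M)
    return (m + k) % M, k  # (ciphertext, key kept by the data owner)

def add_ciphertexts(cs) -> int:
    # Performed by the compute node: it never sees plaintexts or keys.
    total = 0
    for c in cs:
        total = (total + c) % M
    return total

def decrypt_sum(c_sum: int, keys) -> int:
    return (c_sum - sum(keys)) % M

salaries = [50_000, 72_000, 61_500]
pairs = [encrypt(s) for s in salaries]
ciphertexts = [c for c, _ in pairs]
keys = [k for _, k in pairs]
print(decrypt_sum(add_ciphertexts(ciphertexts), keys))  # 183500
```

Real schemes (BFV, CKKS, TFHE) achieve the same add-on-ciphertexts property, plus multiplication, without the decryptor needing a per-value key.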

Note that even in this scheme, there’s one leakage of information: the compute node learns the program that the purchaser is attempting to run. For some applications, this might be unacceptable. An interesting possibility is a “homomorphic interpreter” that can run a program which is itself encrypted, but it’s not currently clear how to design such a system.


What if you need to apply these concepts only probabilistically? For example, introduce something like a DataAudit primitive that happens on-ledger (publicly verifiable) and requires a Datatrust to clone some subset of its data to a mutually trusted third party. At a specific time, the auditor could request that both parties compute a specific function chosen on the fly by the auditor; on receiving identical results from both parties, the auditor can trust that the Datatrust is computing correctly on the true data. You could then allow the auditor to request the exact code to be executed and price the request accordingly.

This way, some of the expensive computations can be paid for probabilistically by an auditor, as opposed to running on every transaction. This could be quite handy if you are coordinating across potentially thousands of different datamarts and want to periodically audit whether you can trust the results they provide.
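A crude sketch of that comparison step (every name here — `run_audit`, the candidate functions, the tampered clone — is hypothetical, and a real deployment would compare signed commitments rather than raw Python values):

```python
import random

# Hypothetical sketch of the DataAudit idea: an auditor picks a function on
# the fly, asks both the Datatrust and a mutually trusted third party
# (holding a cloned subset of the data) to evaluate it, and trusts the
# Datatrust only if the two answers agree. In a real deployment this check
# would fire probabilistically, on a small fraction of transactions.

def run_audit(datatrust_data, third_party_clone, rng):
    fn = rng.choice([sum, max, min, len])  # function chosen on the fly
    return fn(datatrust_data) == fn(third_party_clone)

rng = random.Random(42)
data = [3, 1, 4, 1, 5, 9, 2, 6]
honest_clone = list(data)
tampered_clone = data[:-1] + [100]  # a Datatrust lying about one value

print(run_audit(data, honest_clone, rng))  # an honest node always agrees
results = [run_audit(data, tampered_clone, rng) for _ in range(100)]
print(f"tampered data caught in {results.count(False)} of 100 audits")
```

One design note: the unpredictability of the auditor’s function choice is what makes cheating risky — a dishonest Datatrust can’t precompute plausible answers for every function it might be asked.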


@LRParser Welcome to the forums!

I definitely like the idea of probabilistic checking of computations. All of these primitives are very expensive right now, so checking only a subset of computations is much more feasible to deploy at scale than checking every transaction.

The idea of having “auditors” who perform the service for a fee is intriguing as well.