Homomorphic encryption: The new kid on the block

Dr Ian Cutress writes:

Take, for example, analyzing medical data records: if a researcher needs to process a specific dataset for some analysis, the traditional method would be to encrypt the data, send the data, decrypt the data, and process it – but giving the researcher access to the specifics in the records might not be legal or might face regulatory challenges. With FHE, that researcher can take the encrypted data, perform the analysis and get a result, without ever knowing any specifics of the dataset. This might involve combined statistical analysis of a population over multiple encrypted datasets, or taking those encrypted datasets and using them as additional inputs to train machine learning algorithms, enhancing the accuracy through having more data. Of course, the researcher has to trust that the data given is complete and genuine, however that is arguably a different topic than enabling compute on encrypted data.
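
To make the idea concrete, below is a minimal sketch (in Python) of the Paillier cryptosystem, an additively homomorphic scheme: a key holder encrypts values, an untrusted party adds the ciphertexts without ever decrypting them, and only the key holder decrypts the aggregate. This is an illustration of the principle only – the primes are toy-sized and insecure, the function names are my own, and full FHE schemes such as BGV or CKKS additionally support multiplication on ciphertexts.

    # Toy Paillier scheme: additively homomorphic, for illustration only.
    # Parameters are far too small to be secure; do not use in practice.
    import math
    import random

    # Key generation with demo primes (real deployments use ~2048-bit primes).
    p, q = 1789, 1861
    n = p * q
    n_sq = n * n
    g = n + 1                          # standard choice of generator
    lam = math.lcm(p - 1, q - 1)       # Carmichael function lambda(n)
    mu = pow(lam, -1, n)               # modular inverse of lambda mod n

    def encrypt(m: int) -> int:
        """Encrypt an integer 0 <= m < n under the public key (n, g)."""
        while True:
            r = random.randrange(1, n)
            if math.gcd(r, n) == 1:
                break
        return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

    def decrypt(c: int) -> int:
        """Recover the plaintext using the private key (lam, mu)."""
        x = pow(c, lam, n_sq)
        return ((x - 1) // n) * mu % n

    def add_encrypted(c1: int, c2: int) -> int:
        """Ciphertext of m1 + m2, computed without decrypting either input."""
        return (c1 * c2) % n_sq

    # The "hospital" encrypts patient readings; the "researcher" sums them blind.
    readings = [120, 135, 128, 142]
    ciphertexts = [encrypt(m) for m in readings]

    encrypted_total = ciphertexts[0]
    for c in ciphertexts[1:]:
        encrypted_total = add_encrypted(encrypted_total, c)

    # Only the key holder can decrypt the aggregate; individual values stay hidden.
    assert decrypt(encrypted_total) == sum(readings)
    print("sum of encrypted readings:", decrypt(encrypted_total))

In the medical-records example above, the hospital plays the role of the key holder, while the researcher only ever handles ciphertexts and hands back an encrypted result.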

This development is important because it “might” lead to the demise of “differential privacy” as touted by Apple. Apple’s M1 silicon, based on the Arm design, is getting rave reviews and has the potential to handle AI calculations in an energy-efficient manner. FHE looks all the more significant because enterprises (especially in healthcare) will generate even more data as digitisation proceeds.

I am not going into the remainder of the write-up, but this development (and the innovation it may enable) has a huge bearing on further research. I will keep an eye on this.