How to quantify privacy when using homomorphic encryption?

How can you measure how secure or private the encrypted variables are relative to the real (actual) variables?

I want to compare homomorphic encryption and differential privacy in combination with machine learning models, maybe using a measure like the Kullback-Leibler divergence, but for that I would need the distribution of the encrypted variables.
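For the KL part, something like this is what I had in mind: estimate both distributions with histograms over shared bins and compare them with SciPy. This is just a sketch; the `real` and `transformed` arrays are placeholder data standing in for my actual and privatized features.

```python
import numpy as np
from scipy.stats import entropy

# Placeholder data: `real` holds the actual feature values,
# `transformed` holds the output of some privacy mechanism
# (here, DP-style Laplace noise as an example).
rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=10_000)
transformed = real + rng.laplace(scale=0.5, size=real.shape)

# Estimate both distributions on a shared set of histogram bins.
bins = np.histogram_bin_edges(np.concatenate([real, transformed]), bins=50)
p, _ = np.histogram(real, bins=bins, density=True)
q, _ = np.histogram(transformed, bins=bins, density=True)

# Smooth away zero bins before taking the KL divergence.
eps = 1e-12
kl = entropy(p + eps, q + eps)  # KL(p || q); entropy() normalizes the inputs
print(f"Estimated KL(real || transformed): {kl:.4f}")
```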

I'm using the TenSEAL Python package and a random forest.

Does anyone know a Python package, or a function in TenSEAL, to compare the encrypted values with the real values, or to compute the distribution of the encrypted values?
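For context, this is roughly how I encrypt and decrypt with TenSEAL's CKKS scheme. The ciphertext itself is opaque, so the only values I can directly inspect are the decrypted ones, which differ from the originals only by the small numerical error CKKS introduces. The parameter choices below are just an example, not my production settings.

```python
import numpy as np
import tenseal as ts

# Example CKKS context; the parameters here are illustrative only.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2**40
context.generate_galois_keys()

real = np.random.default_rng(0).normal(size=100)

# Encrypt, then decrypt: any comparison with the real values
# has to go through decryption, since the ciphertext is opaque.
enc = ts.ckks_vector(context, real.tolist())
dec = np.array(enc.decrypt())

# CKKS is an approximate scheme: the decrypted values differ from
# the originals by a small numerical error, not a distributional shift.
print("max abs error:", np.max(np.abs(dec - real)))
```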


