
Videos & Presentations

Watch recorded videos and download the presentations…

VIDEO RECORDINGS

Presentation at Digitalize in Stockholm 2022

Research: Privacy-preserving data analysis. We apply tools from information theory to problems in privacy-preserving data analysis.
Speaker: Sara Saeidian, PhD student, saeidian@kth.se
Supervisors: Tobias J. Oechtering, Mikael Skoglund

Click here to watch the recorded video presentation on “Privacy-preserving data analysis”


OUR PRESENTATIONS

Quantifying Membership Privacy via Information Leakage

Sara Saeidian, Giulia Cervia, Tobias J. Oechtering, Mikael Skoglund, “Quantifying Membership Privacy via Information Leakage,” IEEE Transactions on Information Forensics and Security, vol. 16, pp. 3096–3108, 2021.

Machine learning models are known to memorize the unique properties of individual data points in a training set. This memorization capability can be exploited by several types of attacks to infer information about the training data, most notably, membership inference attacks. In this work, we propose an approach based on information leakage for guaranteeing membership privacy. Specifically, we propose to use a conditional form of the notion of maximal leakage to quantify the information leaking about individual data entries in a dataset, i.e., the entrywise information leakage.
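
For readers less familiar with the underlying quantity, the (unconditional) maximal leakage from a secret X to a released output Y over finite alphabets is commonly defined as

\mathcal{L}(X \to Y) = \log \sum_{y \in \mathcal{Y}} \max_{x \in \mathcal{X}:\, P_X(x) > 0} P_{Y \mid X}(y \mid x),

that is, the logarithm of the sum, over outputs, of the largest conditional probability assigned to each output by any input. The entrywise leakage studied in the paper is a conditional variant of this quantity that isolates the information leaking about a single dataset entry; see the paper for the precise formulation.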

We apply our privacy analysis to the Private Aggregation of Teacher Ensembles (PATE) framework for privacy-preserving classification of sensitive data and prove that the entrywise information leakage of its aggregation mechanism is Schur-concave when the injected noise has a log-concave probability density. The Schur-concavity of this leakage implies that increased consensus among teachers in labelling a query reduces its associated privacy cost. We also derive upper bounds on the entrywise information leakage when the aggregation mechanism uses Laplace distributed noise.
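
The sketch below is a minimal, illustrative implementation of a PATE-style noisy-argmax aggregation with Laplace noise, included only to make the mechanism concrete; the function name noisy_argmax and the noise_scale parameter are our own labels, not taken from the paper or from the PATE reference implementation.

    import numpy as np

    def noisy_argmax(votes, noise_scale, rng=None):
        # Illustrative PATE-style aggregation: perturb each per-class vote
        # count with Laplace noise of scale `noise_scale`, then release the
        # class with the largest noisy count.
        rng = np.random.default_rng() if rng is None else rng
        noisy = np.asarray(votes, dtype=float) + rng.laplace(0.0, noise_scale, size=len(votes))
        return int(np.argmax(noisy))

    # Ten teachers voting over three classes, with strong consensus on class 1.
    # By the Schur-concavity result above, such a peaked vote vector incurs a
    # lower privacy cost than an evenly split one such as [4, 3, 3].
    print(noisy_argmax([1, 8, 1], noise_scale=2.0))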

DOWNLOAD THE PRESENTATION HERE: Quantifying Membership Privacy via Information Leakage