Privacy-Preserving Distributed Expectation Maximization for Gaussian Mixture Models via Fully Homomorphic Encryption

HanaHasan04/PPEM-for-GMM


Supervised by Prof. Adi Akavia, Secure Cloud Computing Laboratory, Fall 2022-2023

In recent years, cloud computing has emerged as a widely adopted and cost-effective solution for storing and processing large volumes of data. A major challenge in secure cloud computing, however, is preserving the privacy of sensitive data while still allowing meaningful analysis.
In our project, we address this challenge with a privacy-preserving distributed expectation-maximization (EM) algorithm for Gaussian mixture models (GMMs). Using fully homomorphic encryption, the proposed method enables centralized federated learning while keeping each party's sensitive data private.
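For intuition, one EM iteration for a GMM alternates between computing responsibilities (E-step) and re-estimating the mixture parameters from responsibility-weighted sums (M-step). The sketch below is a minimal plaintext illustration using numpy and scipy (both listed in the requirements below); `em_step` is a hypothetical helper for exposition, not code taken from this repository.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_step(X, weights, means, covs):
    """One plaintext EM iteration for a Gaussian mixture model.
    X: (n, d) data; weights: (k,); means: (k, d); covs: (k, d, d)."""
    n, _ = X.shape
    k = len(weights)
    # E-step: responsibilities r[i, j] = P(component j | x_i)
    r = np.column_stack([
        weights[j] * multivariate_normal.pdf(X, means[j], covs[j])
        for j in range(k)
    ])
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted sums
    nk = r.sum(axis=0)                      # effective count per component
    new_weights = nk / n
    new_means = (r.T @ X) / nk[:, None]
    new_covs = np.empty_like(covs)
    for j in range(k):
        diff = X - new_means[j]
        new_covs[j] = (r[:, j, None] * diff).T @ diff / nk[j]
    return new_weights, new_means, new_covs
```

In the distributed setting, the responsibility-weighted sums computed in the M-step are exactly the per-party statistics that must be aggregated across data holders, which is where the encryption comes in.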

Step-by-Step Guide:

  • Read the Background document, which provides an overview of the fundamental concepts behind Gaussian mixture models, the expectation-maximization algorithm, and fully homomorphic encryption.
  • View the 2D visualization in the Colab Notebooks.
  • Read the Proposed Approach to understand our solution in detail and view the results.
  • View the source code of GMM: EM vs. PPEM.
  • Take a look at our final presentation.

Requirements:

  • numpy
  • matplotlib
  • scipy
  • tenseal

