Federated_Learning

Federated Learning is a machine learning approach that allows a model to be trained across multiple decentralized devices or servers holding local data samples, without exchanging them. Instead of sharing raw data, only model updates are sent to a central server, preserving data privacy. It is useful for sensitive data (such as healthcare records), edge devices, and scenarios with limited bandwidth. Key challenges include communication overhead and non-IID data distributions across clients. Federated Learning finds applications in healthcare, finance, IoT, and personalized predictions, and ongoing research focuses on addressing these challenges and expanding its use cases.

Key Features:

  • Demonstrates Federated Learning with the EMNIST dataset.
  • Utilizes TensorFlow and TensorFlow Federated for implementation.
  • Provides examples of data preprocessing, model creation, and training on federated data.
  • Visualizes training progress using TensorBoard.

Dependencies

  • TensorFlow
  • TensorFlow Federated
  • Matplotlib

To install the dependencies, run the following commands in your notebook environment (the leading `!` runs a shell command in a Colab cell):

!pip install --quiet --upgrade tensorflow
!pip install --quiet --upgrade tensorflow-federated

Loading the Dataset

The federated EMNIST dataset consists of handwritten digits partitioned by writer, so each simulated client holds the examples of a single writer. It is downloaded and prepared for training in the code.

emnist_train, emnist_test = tff.simulation.datasets.emnist.load_data()

Example of Data Received by a Client

This section provides an example of the data a client receives from the dataset.
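
A minimal sketch of inspecting one client's local data; `create_tf_dataset_for_client` is the TFF simulation API that materializes a single client's examples as a `tf.data.Dataset`, and the choice of `client_ids[0]` is purely illustrative:

import tensorflow_federated as tff

emnist_train, emnist_test = tff.simulation.datasets.emnist.load_data()

# Build a tf.data.Dataset containing only the examples held by one simulated client.
example_client_id = emnist_train.client_ids[0]
example_dataset = emnist_train.create_tf_dataset_for_client(example_client_id)

# Each element is an OrderedDict with a 28x28 'pixels' image and an integer 'label'.
for element in example_dataset.take(1):
    print(element['label'].numpy(), element['pixels'].numpy().shape)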

Train Client Model (Local Model)

This section demonstrates how the model is trained on each client's data locally.
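
As a rough sketch, purely local training can look like the following: a single client's examples are flattened and batched, and a small Keras model is fit on them alone. The layer sizes, batch size, and optimizer settings are illustrative assumptions, not necessarily the exact values used in the notebook:

import tensorflow as tf
import tensorflow_federated as tff

emnist_train, _ = tff.simulation.datasets.emnist.load_data()
client_dataset = emnist_train.create_tf_dataset_for_client(emnist_train.client_ids[0])

# Flatten each 28x28 image into a 784-element vector and batch the examples.
local_data = client_dataset.map(
    lambda e: (tf.reshape(e['pixels'], [-1]), e['label'])).batch(20)

# A small dense classifier trained only on this client's data (local baseline).
local_model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(10, activation='softmax'),
])
local_model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.02),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(),
    metrics=['accuracy'])
local_model.fit(local_data, epochs=5)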

Preprocessing the Input Data

This section shows how the input data is preprocessed before it is used for training.
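
A sketch of the usual TFF EMNIST preprocessing recipe: flatten each image, repeat for a few local epochs, shuffle, and batch. The constants, the helper names (`preprocess`, `make_federated_data`), and the sample of ten clients are assumptions for illustration; `emnist_train` comes from the loading step above:

import collections
import tensorflow as tf

NUM_EPOCHS = 5
BATCH_SIZE = 20
SHUFFLE_BUFFER = 100

def preprocess(dataset):
    # Flatten the 28x28 images to 784-element vectors and rename the fields
    # to the (x, y) structure expected by the model.
    def batch_format_fn(element):
        return collections.OrderedDict(
            x=tf.reshape(element['pixels'], [-1, 784]),
            y=tf.reshape(element['label'], [-1, 1]))
    return dataset.repeat(NUM_EPOCHS).shuffle(SHUFFLE_BUFFER).batch(
        BATCH_SIZE).map(batch_format_fn)

def make_federated_data(client_data, client_ids):
    # Preprocess the local dataset of each selected client.
    return [preprocess(client_data.create_tf_dataset_for_client(cid))
            for cid in client_ids]

federated_train_data = make_federated_data(
    emnist_train, emnist_train.client_ids[:10])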

Creating a Model with Keras

The model architecture used for this Federated Learning project is based on a Keras Sequential model with Dense layers.
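
A sketch of the kind of model this describes: a Keras Sequential model with Dense layers, wrapped for TFF with `tff.learning.models.from_keras_model` (the exact module path can differ between TFF versions). The layer sizes are assumptions, and `federated_train_data` is taken from the preprocessing sketch above:

import tensorflow as tf
import tensorflow_federated as tff

def create_keras_model():
    # A simple dense classifier over the 784-pixel flattened EMNIST images.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(784,)),
        tf.keras.layers.Dense(10, kernel_initializer='zeros'),
        tf.keras.layers.Softmax(),
    ])

def model_fn():
    # TFF requires a fresh, uncompiled Keras model built inside this no-arg
    # function; input_spec must match the preprocessed client datasets.
    return tff.learning.models.from_keras_model(
        create_keras_model(),
        input_spec=federated_train_data[0].element_spec,
        loss=tf.keras.losses.SparseCategoricalCrossentropy(),
        metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])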

Training the Model on Federated Data

This section covers the training process on federated data. It uses a weighted federated averaging algorithm.
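
Roughly, the weighted federated averaging process can be built and driven as below with `tff.learning.algorithms.build_weighted_fed_avg` (again, API paths vary slightly across TFF releases). The optimizers, learning rates, and number of rounds are illustrative, and `model_fn` / `federated_train_data` come from the sketches above:

import tensorflow_federated as tff

# Client updates are averaged on the server, weighted by the number of local examples.
training_process = tff.learning.algorithms.build_weighted_fed_avg(
    model_fn,
    client_optimizer_fn=tff.learning.optimizers.build_sgdm(learning_rate=0.02),
    server_optimizer_fn=tff.learning.optimizers.build_sgdm(learning_rate=1.0))

state = training_process.initialize()

NUM_ROUNDS = 10
for round_num in range(1, NUM_ROUNDS + 1):
    result = training_process.next(state, federated_train_data)
    state = result.state
    print('round {:2d}, metrics={}'.format(round_num, result.metrics))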

Result of Loss and Accuracy

The code reports the training results as loss and accuracy metrics for each federated round.
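
The metrics for a round come back as a nested dictionary; a sketch of pulling out the training loss and accuracy (the exact key names, such as 'client_work' and 'train', depend on the TFF version):

# With build_weighted_fed_avg, the per-round training metrics typically live
# under result.metrics['client_work']['train'].
train_metrics = result.metrics['client_work']['train']
print('loss     =', train_metrics['loss'])
print('accuracy =', train_metrics['sparse_categorical_accuracy'])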

Visualizing Training Progress

The training progress is visualized using TensorBoard, with scalar summaries for each training round.
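
A sketch of how per-round scalars can be written with `tf.summary` and viewed in TensorBoard; the log directory and metric keys are illustrative assumptions, and `training_process`, `federated_train_data`, and `NUM_ROUNDS` come from the training sketch above:

import tensorflow as tf

logdir = '/tmp/logs/scalars/training'  # illustrative log directory
summary_writer = tf.summary.create_file_writer(logdir)

state = training_process.initialize()
with summary_writer.as_default():
    for round_num in range(1, NUM_ROUNDS + 1):
        result = training_process.next(state, federated_train_data)
        state = result.state
        # Write one scalar per training metric for this round.
        for name, value in result.metrics['client_work']['train'].items():
            tf.summary.scalar(name, value, step=round_num)

# In the Colab notebook, TensorBoard can then be launched with:
#   %load_ext tensorboard
#   %tensorboard --logdir /tmp/logs/scalars/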

Usage

To run this project, follow these steps:

  • Install the required dependencies.
  • Execute the code cells in the provided Colab notebook.

Contributing

If you'd like to contribute to this project, please follow these steps:

- Fork the repository.
- Create a new branch (`git checkout -b feature/your-feature`).
- Make your changes and commit them (`git commit -m 'Add some feature'`).
- Push to the branch (`git push origin feature/your-feature`).
- Create a new Pull Request.

License

This project is licensed under the MIT License.
