
EIT_Reconstruction

Description

This project was part of my Master's thesis: "A reconstruction method for electrical impedance tomography based on machine learning and real measurement data".

Electrical impedance tomography (EIT) is a cost-effective and non-invasive imaging technique that makes it possible to determine the spatial impedance distribution of a body. The method is used in scientific, industrial, and medical contexts, one example being the monitoring of lung ventilation.
Many approaches to EIT image reconstruction already exist; they are divided into model-based and data-based approaches. In recent years, data-based approaches have proven increasingly promising, but they require large training data sets, which are currently usually generated by simulations. This work tests whether such models can also be trained on a data set of real measurement data.
First, an experimental setup is developed to generate such a data set. It consists of an EIT device (the ScioSpec EIT32), a Plexiglas tank filled with water whose impedance distribution is recorded, and a positioning system that moves objects inside the tank to create different impedance distributions.

The following image shows the experimental setup: experimantal_setup.png

The data generation process is shown in the following image: data_generation.png

The experimental setup is then used to collect over 14 000 samples. Three types of models are evaluated and compared on this data set: a linear regression, a k-nearest neighbors (KNN) regression, and a neural network. In addition, the effects of data augmentation and dimensionality reduction on EIT data are analyzed, using the methods of noise, rotation, Gaussian blur, and superposition augmentation.

The possibility of training data-based models on real measurement data was confirmed. The neural networks performed up to 28 % better in the relevant metrics than the linear and k-nearest neighbor regression approaches. It was also found that the quality of the data set significantly influences the generalization ability of the models, so it was investigated whether the variation in the data can be improved through augmentation. Augmentation improved model quality, especially for small amounts of data (120 samples): noise and rotation augmentation increased the performance of the algorithms by up to 50 %, Gaussian blur augmentation improved the visual impression of the reconstructions, and superposition augmentation enabled the models to generalize better to complex scenarios.

Short Instructions

Installation

To install the required packages, run:

```shell
pip install -r requirements.txt
```

Overview:

  • Control of the G-code positioning device is handled in GCodeDevice.py.
  • Data handling for the EIT devices is done in data_reader.py.
  • Most of the remaining code is used for data generation, training, and evaluation of the models.
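The G-code control can be illustrated with a minimal sketch. The class and method names below (`SimpleGCodeDevice`, `move_to`) are hypothetical, not the actual API of GCodeDevice.py, which presumably writes the commands to a serial port:

```python
class SimpleGCodeDevice:
    """Formats absolute-positioning G-code moves for an XY positioning system.
    Hypothetical sketch; the repository's GCodeDevice.py may look different."""

    def __init__(self, send=print):
        self.send = send   # transport callback; real code would write to a serial port
        self.send("G90")   # switch the printer to absolute positioning

    def move_to(self, x, y, feed_rate=1500):
        cmd = f"G1 X{x:.2f} Y{y:.2f} F{feed_rate}"
        self.send(cmd)
        return cmd

# Example: record the commands instead of sending them over serial
log = []
dev = SimpleGCodeDevice(send=log.append)
dev.move_to(10.0, 25.5)
print(log)  # → ['G90', 'G1 X10.00 Y25.50 F1500']
```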

Usage

Data Generation

The general procedure is:

  1. Start the EIT32 software and run a measurement with your preferred settings (choose an output folder).
  2. Run the collect_real_data.py script to collect the data (enter the correct settings and output path).

settings.png

  3. The script will control the 3D printer to move the object in the tank and collect the data.
  4. The data will be saved in the output folder as a pickle file.
General tips:

  • On many occasions it is necessary to choose between absolute EIT and relative EIT as a parameter of the function.

You can generate sample data by running the following scripts:

Before you can use the data in training, you can combine multiple files with the following script:
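Combining the collected files can be sketched as follows. The function name and the assumption that each pickle file stores a flat list of samples are illustrative; the repository's actual on-disk format may differ:

```python
import pickle
from pathlib import Path

def combine_pickles(folder, out_path):
    """Concatenate the sample lists from every .pkl file in `folder` into
    a single training file. Assumes each pickle stores a list of samples."""
    samples = []
    for path in sorted(Path(folder).glob("*.pkl")):
        with open(path, "rb") as f:
            samples.extend(pickle.load(f))
    with open(out_path, "wb") as f:
        pickle.dump(samples, f)
    return len(samples)
```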

Data

The default data location is Collected_Data or Collected_Data_Experiments.
It is recommended to move the data used for training into a separate folder such as Training_Data, to keep newly collected data apart from the training data.

Training

Training will use the GPU if available (some manual adjustments in the code might be necessary).
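The kind of device selection the training scripts need can be sketched as below; `pick_device` is a hypothetical helper, not a function from the repository:

```python
import importlib.util

def pick_device():
    """Return 'cuda' if PyTorch is installed and sees a GPU, else 'cpu'.
    Models saved on a GPU machine may additionally need map_location='cpu'
    when loaded on a machine without CUDA."""
    if importlib.util.find_spec("torch") is not None:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    return "cpu"

print(pick_device())
```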

The scripts used for training are located in Model_Training.

Training Procedure
  1. Choose the model you want to train in the Model_Training_with_pca_reduction_copy.py script
  2. Choose the location of the training data
  3. Choose the augmentations and the PCA reduction in the script
  4. Run the script and enter the correct parameters
  5. The script will train the model and save it to the specified location
  6. The script will also save the training history and the model architecture
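The PCA reduction step can be sketched with plain NumPy. The shapes and the random stand-in data below are illustrative assumptions, not the repository's actual dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 120 voltage vectors (features) and flattened target images
X = rng.normal(size=(120, 1024))     # EIT voltage measurements
Y = rng.normal(size=(120, 64 * 64))  # ground-truth impedance images, flattened

# PCA via SVD: project the centered data onto the first k principal components
k = 32
X_centered = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
X_reduced = X_centered @ Vt[:k].T    # shape (120, 32)

# Linear regression from the reduced features to the images (least squares)
W, *_ = np.linalg.lstsq(X_reduced, Y, rcond=None)
Y_pred = X_reduced @ W
print(Y_pred.shape)  # → (120, 4096)
```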
Augmentations

There are multiple types of augmentations available:

  • Noise
  • Gaussian Blur
  • Rotation (rot_aug.png)
  • Superposition (superpos.png)
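A minimal sketch of the noise and rotation augmentations follows. The electrode count and the assumption that a 90° image rotation corresponds to a quarter-ring cyclic shift of the measurement vector are illustrative, not taken from the repository:

```python
import numpy as np

rng = np.random.default_rng(42)

def noise_augment(voltages, sigma=0.01):
    """Additive Gaussian noise on the measured voltage vector."""
    return voltages + rng.normal(0.0, sigma, size=voltages.shape)

def rotation_augment(voltages, image, n_electrodes=32):
    """90-degree rotation: rotate the target image and cyclically shift the
    measurement vector by a quarter of the electrode ring."""
    shift = n_electrodes // 4
    return np.roll(voltages, shift), np.rot90(image)

voltages = np.linspace(0.0, 1.0, 32)
image = np.zeros((64, 64))
image[10, 20] = 1.0

noisy = noise_augment(voltages)
v_rot, img_rot = rotation_augment(voltages, image)
print(noisy.shape, img_rot.shape)  # → (32,) (64, 64)
```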

Evaluation

The scripts used for evaluation are located in Evaluation.

Usage of a trained model

Sample usages can be found in:

Comparison algorithms can be found in Try_Other_Reconstruction_methodes.py.
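As an illustration of loading a pickled model for reconstruction (the weight-matrix format and the shapes here are assumptions; the repository's saved models may be full network objects):

```python
import io
import pickle
import numpy as np

# Hypothetical: a trained linear model saved with pickle is just a weight
# matrix mapping the voltage vector to a flattened impedance image.
W = np.random.default_rng(1).normal(size=(1024, 64 * 64))
buf = io.BytesIO()          # stands in for the .pkl file on disk
pickle.dump(W, buf)
buf.seek(0)

model = pickle.load(buf)
voltages = np.zeros(1024)   # one measurement frame
reconstruction = (voltages @ model).reshape(64, 64)
print(reconstruction.shape)  # → (64, 64)
```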

Future ideas:

Further Information

For more in-depth information, see my Master's thesis.
