Data-centric ML in Quantum Information Science

Author: Sanjaya Lohani

Please report bugs to slohani@mlphys.com.

Thanks to Brian T. Kirby, Ryan T. Glasser, Sean D. Huver and Thomas A. Searles

Preprint:

Lohani, S., Lukens, J.M., Glasser, R.T., Searles, T.A. and Kirby, B.T., 2022. Data-Centric Machine Learning in Quantum Information Science. arXiv preprint arXiv:2201.09134.

Built With

Getting Started

```sh
pip install mlphys
```

Usage

Simulating various distributions and measurements (including inference):

```python
import mlphys.deepqis.simulator.distributions as dist        # samplers for the quantum-state distributions
import mlphys.deepqis.simulator.measurements as meas         # simulated measurements on the sampled states
import mlphys.deepqis.utils.Alpha_Measure as find_alpha      # concentration parameter (alpha) measure
import mlphys.deepqis.utils.Concurrence_Measure as find_con  # two-qubit concurrence measure
import mlphys.deepqis.utils.Purity_Measure as find_pm        # purity measure, Tr(rho^2)
import mlphys.deepqis.network.Inference as inference         # reconstruction-network inference
import mlphys.deepqis.utils.Fidelity_Measure as fm           # reconstruction fidelity measure
...
```
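The modules above cover the workflow used throughout the paper: sample states from a chosen distribution, simulate (optionally finite-shot) measurements on them, run the inference network to reconstruct the states, and score the reconstruction with fidelity, purity, or concurrence. The exact mlphys call signatures are not reproduced in this README, so the following is a minimal NumPy-only sketch of the same simulation steps; every function name in it is illustrative and not part of the mlphys API. The finite-shot sampling mirrors the low-shot study of Fig. 4.

```python
import numpy as np

def haar_random_pure_state(dim, rng):
    """Haar-random pure state: a normalized complex Gaussian vector."""
    vec = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

def simulate_pauli_expectations(rho, shots, rng):
    """Finite-shot estimates of the 16 two-qubit Pauli expectation values."""
    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    estimates = []
    for A in (I, X, Y, Z):
        for B in (I, X, Y, Z):
            evals, evecs = np.linalg.eigh(np.kron(A, B))
            # Born-rule probabilities of each eigenvector outcome, then multinomial sampling.
            probs = np.clip(np.real(np.diag(evecs.conj().T @ rho @ evecs)), 0, None)
            counts = rng.multinomial(shots, probs / probs.sum())
            estimates.append(np.dot(evals, counts / shots))
    return np.array(estimates)

def fidelity(rho, sigma):
    """Uhlmann fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    w, v = np.linalg.eigh(rho)
    sqrt_rho = (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T
    vals = np.linalg.eigvalsh(sqrt_rho @ sigma @ sqrt_rho)
    return float(np.sum(np.sqrt(np.clip(vals, 0, None))) ** 2)

rng = np.random.default_rng(0)
psi = haar_random_pure_state(4, rng)
rho_true = np.outer(psi, psi.conj())                               # "ground-truth" two-qubit state
data = simulate_pauli_expectations(rho_true, shots=1024, rng=rng)  # tomography-style input features
print("measurement features:", data.shape)

# Stand-in for a network reconstruction: a slightly depolarized copy of the true state.
rho_hat = 0.95 * rho_true + 0.05 * np.eye(4) / 4
print("purity  :", np.real(np.trace(rho_true @ rho_true)))
print("fidelity:", fidelity(rho_true, rho_hat))
```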

Sub-module

Tutorials

For worked examples (Google Colab notebooks), please refer to the hands-on tutorials listed below.

Hands-on coding examples for the results

  • Reducing spurious correlations:

    • Accuracy of entanglement-separability classification - Fig. 2(a)
    • Network reconstruction fidelity versus the percentage of separable states added to a training set containing entangled states - Fig. 2(b)
    • Reconstruction fidelity for test states from the MA distribution, for a MEMS-only trained network and after adding a small fraction of separable states to the training set - Fig. 2(c, d)
  • Reconstruction fidelity versus number of trainable parameters for various training set distributions:

    • Data-centric approach (fidelity versus trainable parameters) - Fig. 3(a)
    • The concurrence and purity of random quantum states from the Hilbert–Schmidt–Haar (HS–Haar), Życzkowski, engineered, and IBM Q distributions - Fig. 3(a) insets (a stand-alone concurrence/purity sketch follows this list)
  • Engineered states on the concurrence-purity plane:

  • Data-centric approach in the low-shot regime:

    • Reconstructing the NISQ-sampled distribution with simulated measurements performed with shots ranging from 128 to 8192 - Fig. 4
  • Heterogeneous state complexity:

  • Optimizing learning rate:

  • Engineered states:

  • Reconstruction fidelity of the NISQ-sampled test set versus the mean purity of various MA-distributed training states when K = 4:

    • The mean purity of the training set matches the minimum and mean purity of the NISQ-sampled states when D = K = 4 - Extended Data Fig. 3
  • Reconstruction fidelity versus trainable parameters for various MA-distributed training sets:

    • The pairs of concentration parameter and K-value are chosen as (α, K) ∈ {(0.01, 4), (0.1, 4), (0.3, 4), (0.8, 4), (0.3394, 6)} for the training sets - Extended Data Fig. 4 (see the MA-sampling sketch after this list)
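The Fig. 3(a) insets referenced above compare the concurrence and purity of states drawn from the different ensembles. Both quantities have closed forms for two-qubit density matrices (purity Tr ρ², Wootters concurrence); the sketch below is a stand-alone NumPy reference implementation, not the mlphys Concurrence_Measure / Purity_Measure code.

```python
import numpy as np

SY = np.array([[0, -1j], [1j, 0]])   # Pauli-Y
YY = np.kron(SY, SY)                 # spin-flip operator sigma_y (x) sigma_y

def purity(rho):
    """Purity Tr(rho^2): 1 for pure states, 0.25 for the two-qubit maximally mixed state."""
    return float(np.real(np.trace(rho @ rho)))

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix (0 = separable, 1 = maximally entangled)."""
    rho_tilde = YY @ rho.conj() @ YY
    # lambda_i: decreasing square roots of the eigenvalues of rho * rho_tilde.
    lam = np.sort(np.sqrt(np.clip(np.linalg.eigvals(rho @ rho_tilde).real, 0, None)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Example: Werner state p|Phi+><Phi+| + (1 - p) I/4, entangled for p > 1/3.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
p = 0.8
rho = p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4
print("concurrence:", concurrence(rho))   # (3p - 1)/2 = 0.7 for the Werner state
print("purity     :", purity(rho))
```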
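The Extended Data figures above vary the Mai-Alquier (MA) concentration parameter α and the number of mixed components K. In the preprint's construction, an MA sample is ρ = Σᵢ xᵢ |ψᵢ⟩⟨ψᵢ|, with weights x drawn from a symmetric Dirichlet(α) distribution over K Haar-random pure states |ψᵢ⟩; increasing α or K makes the samples more mixed, lowering the mean purity. The sketch below is an illustrative sampler written against that definition (not the mlphys simulator.distributions implementation), evaluated at the (α, K) pairs listed above.

```python
import numpy as np

def haar_random_pure_states(k, dim, rng):
    """k Haar-random pure states, returned as columns of a (dim, k) array."""
    mat = rng.normal(size=(dim, k)) + 1j * rng.normal(size=(dim, k))
    return mat / np.linalg.norm(mat, axis=0)

def sample_ma_state(alpha, k, dim=4, rng=None):
    """One density matrix from the MA ensemble:
    rho = sum_i x_i |psi_i><psi_i|, with x ~ Dirichlet(alpha, ..., alpha)."""
    rng = np.random.default_rng() if rng is None else rng
    weights = rng.dirichlet([alpha] * k)
    states = haar_random_pure_states(k, dim, rng)
    return sum(w * np.outer(states[:, i], states[:, i].conj())
               for i, w in enumerate(weights))

rng = np.random.default_rng(1)
for alpha, k in [(0.01, 4), (0.1, 4), (0.3, 4), (0.8, 4), (0.3394, 6)]:
    samples = [sample_ma_state(alpha, k, rng=rng) for _ in range(500)]
    mean_purity = np.mean([np.real(np.trace(r @ r)) for r in samples])
    print(f"alpha={alpha}, K={k}: mean purity ~ {mean_purity:.3f}")
```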