
Dbi cost functions #1269

Merged
merged 135 commits into master on Jun 3, 2024
Conversation


@wrightjandrew (Contributor) commented Mar 16, 2024

Features include:

Changes in double_bracket.py, which gains new cost functions.

These are selected via the loss function:

if self.cost is DoubleBracketCostFunction.off_diagonal_norm:
    loss = self.off_diagonal_norm
elif self.cost is DoubleBracketCostFunction.least_squares:
    loss = self.least_squares(d)
elif self.cost is DoubleBracketCostFunction.energy_fluctuation:
    loss = self.energy_fluctuation(self.ref_state)
# set back the initial configuration
self.h = h_copy
return loss
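For intuition, the two simplest cost functions can be sketched in plain NumPy (a minimal illustration, not the qibo implementation: `off_diagonal_norm` here is the Hilbert-Schmidt norm of the off-diagonal restriction of $H$, and `energy_fluctuation` is the standard deviation of $H$ in a reference state):

```python
import numpy as np

def off_diagonal_norm(h):
    """Hilbert-Schmidt norm of the off-diagonal part of h."""
    sigma = h - np.diag(np.diag(h))
    return np.sqrt(np.trace(sigma.conj().T @ sigma).real)

def energy_fluctuation(h, state):
    """Standard deviation of h in `state`: sqrt(<h^2> - <h>^2)."""
    mean = (state.conj() @ h @ state).real
    mean_sq = (state.conj() @ h @ h @ state).real
    return np.sqrt(max(mean_sq - mean**2, 0.0))

h = np.array([[1.0, 0.5], [0.5, -1.0]])
state = np.array([1.0, 0.0])
print(off_diagonal_norm(h))          # sqrt(0.5) ~ 0.7071
print(energy_fluctuation(h, state))  # sqrt(1.25 - 1) = 0.5
```

Both costs vanish exactly when $H$ is diagonal (or the reference state is an eigenstate), which is why either can drive the diagonalizing flow.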

Scheduling strategies for finding the local minimum given a bracket $W=[D,H]$.

These are implemented in utils_scheduling.py and include:

  • grid_search_step
  • hyperopt_step
  • polynomial_step
  • simulated_annealing_step
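As a rough sketch of the simplest of these, a grid-search step just evaluates the post-rotation loss on a grid of durations and returns the best one (helper names and the sign convention of the generator are illustrative assumptions here, not the signatures in utils_scheduling.py):

```python
import numpy as np
from scipy.linalg import expm

def off_diagonal_norm(h):
    """Hilbert-Schmidt norm of the off-diagonal part of h."""
    sigma = h - np.diag(np.diag(h))
    return np.sqrt(np.trace(sigma.conj().T @ sigma).real)

def dbr_loss(h, d, s):
    """Loss after one double-bracket rotation H -> e^{sW} H e^{-sW}, W = [D, H]."""
    w = d @ h - h @ d
    u = expm(s * w)
    return off_diagonal_norm(u @ h @ u.conj().T)

def grid_search_step(h, d, s_min=1e-3, s_max=1.0, num=50):
    """Return the rotation duration s on the grid that minimizes the loss."""
    grid = np.linspace(s_min, s_max, num)
    return grid[int(np.argmin([dbr_loss(h, d, s) for s in grid]))]

h = np.array([[1.0, 0.4], [0.4, -1.0]])
d = np.diag(np.diag(h))  # canonical-style choice: D = diag(H)
s_best = grid_search_step(h, d)
```

The other strategies replace the exhaustive grid with hyperopt sampling, a local polynomial fit of the loss in $s$, or simulated annealing, but they all answer the same question: how long to run one rotation.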

Gradient descent methods for finding $D$ operators

This is done in utils_gradients.py. There are two ways of finding gradients, determined by how we parametrize $D$.
We parametrize the ansatz for $D$ based on two possible choices of representing a matrix

  • the computational basis representation of the matrix $A = \sum_{i,j=0}^{2^L-1} A_{i,j} |i\rangle\langle j|$ where $L$ is the number of qubits.
  • the Pauli operator basis using the orthogonality in the Hilbert-Schmidt scalar product $A = \sum_{\mu,\nu} A_{\mu,\nu} \otimes_{k=1}^L (Z^{\mu_k}X^{\nu_k})$ where $A_{\mu,\nu} = \langle A, \otimes_{k=1}^L (Z^{\mu_k}X^{\nu_k})\rangle_{HS}$
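The Hilbert-Schmidt decomposition above can be sketched directly (a toy version for small $L$; the repository's decompose_into_Pauli_basis in utils_gradients.py may differ in conventions and normalization):

```python
import numpy as np
from itertools import product
from functools import reduce

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def basis_element(mu, nu):
    """Tensor product over qubits of Z^{mu_k} X^{nu_k}."""
    factors = [np.linalg.matrix_power(Z, m) @ np.linalg.matrix_power(X, n)
               for m, n in zip(mu, nu)]
    return reduce(np.kron, factors)

def decompose_into_pauli_basis(a, nqubits):
    """Coefficients A_{mu,nu} = <B_{mu,nu}, A>_HS / 2^L via tr(B^dagger A)."""
    coeffs = {}
    for mu in product((0, 1), repeat=nqubits):
        for nu in product((0, 1), repeat=nqubits):
            b = basis_element(mu, nu)
            coeffs[(mu, nu)] = np.trace(b.conj().T @ a) / 2**nqubits
    return coeffs

# Round trip: rebuild A from its coefficients.
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
coeffs = decompose_into_pauli_basis(a, 2)
rebuilt = sum(c * basis_element(mu, nu) for (mu, nu), c in coeffs.items())
```

The round trip works because the $2^{2L}$ operators $\otimes_k Z^{\mu_k} X^{\nu_k}$ are mutually orthogonal in the Hilbert-Schmidt product with norm squared $2^L$.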

More specifically, we parametrize diagonal matrices $D$ in two ways:

  • via coupling strengths of local Pauli operators $D(B,J) = \sum_i B_i Z_i +\sum_{i,j} J_{i,j} Z_i Z_j$
  • via matrix elements in the computational basis $D(d) = \sum_i d_i |i\rangle\langle i|$.
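Both parametrizations produce diagonal matrices and are cheap to build explicitly; a sketch (function and variable names are illustrative, not the repository's API):

```python
import numpy as np

def z_diag(i, L):
    """Diagonal of the Pauli Z_i operator on L qubits, as a length-2^L vector."""
    bits = (np.arange(2**L) >> (L - 1 - i)) & 1
    return 1.0 - 2.0 * bits  # +1 where bit i is 0, -1 where it is 1

def d_ising(B, J, L):
    """D(B, J) = sum_i B_i Z_i + sum_{i<j} J_ij Z_i Z_j, as a diagonal matrix."""
    diag = np.zeros(2**L)
    for i in range(L):
        diag += B[i] * z_diag(i, L)
        for j in range(i + 1, L):
            diag += J[i, j] * z_diag(i, L) * z_diag(j, L)
    return np.diag(diag)

def d_element_wise(d):
    """D(d) = sum_i d_i |i><i| : just the diagonal matrix of the parameters."""
    return np.diag(np.asarray(d, dtype=float))

L = 2
D = d_ising(B=np.array([0.5, -0.3]), J=np.array([[0, 0.2], [0, 0]]), L=L)
print(np.diag(D))  # values on |00>, |01>, |10>, |11>
```

Note the parameter counts: the Ising-type ansatz has $L + L(L-1)/2$ parameters, while the element-wise ansatz has $2^L$, which is why the latter is only practical for small systems.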

The computational basis ansatz is there for completeness: once an operator $D$ is found it has to be compiled, so a Pauli basis representation will be used in any case. We recommend running gradient descent directly in the Pauli basis representation. In principle, the full optimization in the computational basis could find operators that outperform an Ising-model ansatz, but the resulting operator may be inefficient to compile.

For these parametrizations there is an associated gradient descent method to find the numerical values of the parameters $(B,J)$ or $(d)$.
See https://github.com/qiboteam/qibo/blob/dbi_cost_functions/examples/dbi/dbi_cost_functions_and_d_gradients_tutorial.ipynb for results.

  • For $D(B,0)$, i.e. the 1-local magnetic field, we have the default
    gradient_descent_dbr_pauli_basis. The naming convention reflects that we perform gradient descent to optimize a double-bracket rotation, with the ansatz parametrized in the Pauli basis.
  • For the element wise search of the $D(d)$ operator we have
    d_opt, loss_opt, grad_opt, diags_opt = gradient_descent_dbr_computational_basis(dbi, params, 100, lr=1e-2, d_type=d_ansatz_type.element_wise)
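The overall loop can be illustrated with a self-contained finite-difference stand-in for the Pauli-basis case (illustrative only: the actual gradient_descent_dbr_pauli_basis operates on a dbi object and computes its gradients differently):

```python
import numpy as np
from scipy.linalg import expm

def z_diag(i, L):
    """Diagonal of Pauli Z on qubit i, as a length-2^L vector."""
    bits = (np.arange(2**L) >> (L - 1 - i)) & 1
    return 1.0 - 2.0 * bits

def loss(B, h, s, L):
    """Off-diagonal norm after one DBR generated by D(B) = sum_i B_i Z_i."""
    d = np.diag(sum(B[i] * z_diag(i, L) for i in range(L)))
    w = d @ h - h @ d
    u = expm(s * w)
    h_new = u @ h @ u.conj().T
    sigma = h_new - np.diag(np.diag(h_new))
    return np.sqrt(np.trace(sigma.conj().T @ sigma).real)

def gd_dbr_pauli_basis(h, B0, s=0.1, lr=0.05, iters=100, eps=1e-6):
    """Finite-difference gradient descent on the 1-local field strengths B."""
    L = int(np.log2(h.shape[0]))
    B = np.array(B0, dtype=float)
    best_B, best_loss = B.copy(), loss(B, h, s, L)
    for _ in range(iters):
        grad = np.zeros_like(B)
        for i in range(len(B)):
            dB = np.zeros_like(B)
            dB[i] = eps
            grad[i] = (loss(B + dB, h, s, L) - loss(B - dB, h, s, L)) / (2 * eps)
        B -= lr * grad
        cur = loss(B, h, s, L)
        if cur < best_loss:  # keep the best iterate seen so far
            best_B, best_loss = B.copy(), cur
    return best_B

rng = np.random.default_rng(1)
m = rng.normal(size=(4, 4))
h = (m + m.T) / 2  # random 2-qubit symmetric Hamiltonian
B_opt = gd_dbr_pauli_basis(h, B0=[1.0, 1.0])
```

Analytic gradients (as in utils_gradients.py) avoid the $2\,|B|$ extra loss evaluations per step that the finite-difference version pays.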

Checklist:

  • Reviewers confirm new code works as expected.
  • Tests are passing.
  • Coverage does not decrease.
  • Documentation is updated.

There is still some work to be done, but I wanted to try out the pull-request workflow and check that I'm doing things correctly.

Sam-XiaoyueLi and others added 30 commits January 25, 2024 14:17
…_scheduling_polynomial; test_double_bracket_iteration_scheduling_grid_hyperopt
Co-authored-by: Edoardo Pedicillo <edoardo.pedicillo@gmail.com>
@@ -162,3 +101,58 @@ def cs_angle_sgn(dbi_object, d):
)
)
return np.sign(norm)


def decompose_into_Pauli_basis(h_matrix: np.array, pauli_operators: list):

@Sam-XiaoyueLi please check that including this will not impact any function using it?

@Sam-XiaoyueLi Sam-XiaoyueLi added this pull request to the merge queue Jun 3, 2024
Merged via the queue into master with commit 2758cd2 Jun 3, 2024
13 of 25 checks passed
@marekgluza marekgluza deleted the dbi_cost_functions branch June 3, 2024 04:09
@marekgluza

@scarrazza that was a long one! :) Many thanks among others to @wrightjandrew for the initiative on adding cost functions that help gradient descent and to @Sam-XiaoyueLi for suggesting the new optimizers and implementing the strategies + tests (there's a lot of things that went into this multi branch PR). Also thanks @andrea-pasquale for helping us finalize the tests!

Labels
documentation (Improvements or additions to documentation), enhancement (New feature or request)
6 participants