Support constrained Hilbert spaces in ARNN #1659
base: master
Codecov Report

@@            Coverage Diff             @@
##           master    #1659      +/-   ##
==========================================
+ Coverage   82.68%   82.76%   +0.08%
==========================================
  Files         298      298
  Lines       18299    18290       -9
  Branches     3536     3519      -17
==========================================
+ Hits        15130    15138       +8
+ Misses       2481     2470      -11
+ Partials      688      682       -6
==========================================

View full report in Codecov by Sentry.
I am not the biggest fan of the proliferation of specialised logic in classes that are probably going to be subclassed by users (e.g. ARNNSequential, FastARNNSequential). But if it's just a flag, why not.
When defining an ARNN wave function on a constrained Hilbert space, we can first define an unconstrained wave function, then set the amplitudes of the states outside the constrained space to zero, and finally reweight the remaining states so that the wave function is normalized.
There are two ways to do the reweighting. The first is to multiply a single global constant onto all the remaining states, which I call 'global reweighting' here. The second is to reweight the conditional probabilities in each AR sampling step, as described in Hibat-Allah et al. (2020), Appendix D.2, which I call 'conditional reweighting'.
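The conditional reweighting can be sketched as follows for a total-sz constraint: at each AR step, zero out any spin value from which the constraint can no longer be satisfied, then renormalize the surviving conditionals. The function name, shapes, and the two-site default below are illustrative assumptions, not NetKet's actual API.

```python
import numpy as np

def reweight_conditionals(cond_probs, partial_config, n_sites=2, total_sz=0):
    """Zero out conditionals that make the constraint unreachable, then
    renormalize.  cond_probs[i] is the probability of the next spin taking
    the value (-1, +1)[i]; partial_config holds the spins already sampled.
    All names here are illustrative, not NetKet's API."""
    n_left = n_sites - len(partial_config) - 1  # sites after the current one
    probs = np.array(cond_probs, dtype=float)
    for i, s in enumerate((-1, +1)):
        sz = sum(partial_config) + s
        # the remaining sites can change the total sz by at most n_left
        if abs(total_sz - sz) > n_left:
            probs[i] = 0.0
    return probs / probs.sum()
```

For instance, once the first of two spins is up, the second conditional collapses onto spin down: `reweight_conditionals([0.2, 0.8], [+1])` returns `[1.0, 0.0]`.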
For example, take an unconstrained wave function on two spin-1/2 sites and impose the constraint total_sz = 0: the wave function obtained with the global reweighting generally differs from the one obtained with the conditional reweighting, because the two schemes redistribute the removed probability weight differently.
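As a concrete numerical illustration, assume the unconstrained Born probabilities on the four basis states are (0.4, 0.2, 0.3, 0.1); these numbers are hypothetical, chosen only to show that the two reweightings give different distributions on the constrained space.

```python
import numpy as np

# Hypothetical unconstrained probabilities |psi|^2 on two spin-1/2 sites,
# in the basis order (up,up), (up,down), (down,up), (down,down).
p = np.array([0.4, 0.2, 0.3, 0.1])
allowed = np.array([False, True, True, False])  # total_sz = 0 states

# Global reweighting: zero out forbidden states, normalize once.
p_global = np.where(allowed, p, 0.0)
p_global /= p_global.sum()

# Conditional reweighting: renormalize each AR conditional separately.
p_s1_up = p[0] + p[1]  # marginal probability of the first spin being up
# After fixing the first spin, the constraint forces the second spin
# uniquely, so every allowed conditional renormalizes to 1.
p_cond = np.array([0.0, p_s1_up * 1.0, (1 - p_s1_up) * 1.0, 0.0])
```

Here `p_global` is (0, 0.4, 0.6, 0) while `p_cond` is (0, 0.6, 0.4, 0), so the two reweightings genuinely disagree on the same constrained space.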
The global reweighting works not only with ARNN but also with other ansatzes, and it can be implemented with Markov chain samplers but not with the direct sampler, since a global rescaling leaves the conditionals that the direct sampler draws from unchanged. The conditional reweighting, on the other hand, works specifically with ARNN, and can be used with either Markov chain samplers or the direct sampler.
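The reason Markov chain samplers cope with the global reweighting is that Metropolis acceptance only uses ratios of |psi|^2, so the global normalization constant cancels and never needs to be computed. A toy sketch over the same four basis states (the indexing, probabilities, and function names are assumptions for illustration, not NetKet code):

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_amp2(state, p_unconstrained):
    """Unnormalized |psi|^2 with forbidden states masked to zero.
    state indexes the basis (up,up), (up,down), (down,up), (down,down)."""
    allowed = np.array([False, True, True, False])  # total_sz = 0
    return p_unconstrained[state] if allowed[state] else 0.0

p = np.array([0.4, 0.2, 0.3, 0.1])  # hypothetical unconstrained |psi|^2
state, counts = 1, np.zeros(4)      # start from an allowed state
for _ in range(20000):
    proposal = rng.integers(4)
    # Only the ratio of masked amplitudes appears: the global
    # reweighting constant cancels here.
    a = masked_amp2(proposal, p) / masked_amp2(state, p)
    if rng.random() < a:
        state = proposal
    counts[state] += 1
# counts concentrates on the two allowed states, roughly 0.4 : 0.6
```

Forbidden proposals have zero masked amplitude and are always rejected, so the chain samples the globally reweighted distribution without ever evaluating its normalization.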
This PR implements the conditional reweighting in the ARNN model. We plan to implement the global reweighting as a 'post-processing' wrapper over any sampler, and it would be interesting to compare the two reweighting methods' expressive power and training convergence speed.