Autodifferentiation support for PytorchBackend
#1276
base: master
Conversation
Labels: pytorch, backend
The only differentiation method that was not working for the pytorch backend was the parameter shift rule.
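For context, the parameter shift rule computes exact gradients by evaluating the circuit at two shifted parameter values rather than by backpropagation. A minimal backend-agnostic sketch (the `expectation` function is a toy stand-in for a circuit expectation value, not qibo's API):

```python
import math

def expectation(theta):
    # Toy stand-in for <Z> after an RY(theta) rotation on |0>.
    return math.cos(theta)

def parameter_shift_grad(f, theta, s=math.pi / 2):
    # Parameter shift rule for gates whose generator has eigenvalues +/- 1/2:
    # df/dtheta = [f(theta + s) - f(theta - s)] / 2, with shift s = pi/2.
    return (f(theta + s) - f(theta - s)) / 2

theta = 0.3
grad = parameter_shift_grad(expectation, theta)
# For f = cos, the rule is exact: grad == -sin(theta).
```

Because the rule only needs forward evaluations, it works on hardware and on any backend, which is why it is the last differentiation method to wire up here.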
Codecov Report
Attention: Patch coverage is …
Additional details and impacted files
@@ Coverage Diff @@
## master #1276 +/- ##
==========================================
- Coverage 99.82% 99.75% -0.07%
==========================================
Files 72 72
Lines 10560 10645 +85
==========================================
+ Hits 10541 10619 +78
- Misses 19 26 +7
There are also all the classes inside …
Ok, I forgot these tests; I will try to fix them tomorrow.
I have encountered a problem while coding the SGD training procedure of the VQE for the pytorch backend.
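To make the requirement concrete, here is a minimal sketch of such an SGD loop in plain `torch` (the single-parameter `cos` loss is a toy stand-in for a VQE expectation value, not qibo code): the loop only works if the backend keeps every operation inside the autograd graph.

```python
import torch

# Toy one-parameter "ansatz": the loss stands in for <psi(theta)|H|psi(theta)>.
theta = torch.tensor(0.3, requires_grad=True)
optimizer = torch.optim.SGD([theta], lr=0.1)

for _ in range(200):
    optimizer.zero_grad()
    loss = torch.cos(theta)  # minimum -1 at theta = pi
    loss.backward()          # fails if any backend op breaks the autograd graph
    optimizer.step()

# loss.item() approaches -1 as theta converges toward pi
```

Any gate matrix built with numpy along the way would detach `theta` from the graph and make `loss.backward()` raise, which is exactly the failure described below.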
Do you mean this cast? Because this is overloaded by the …
That cast is fine. The problem is in the two lines before it: `cos = self.np.cos(theta / 2.0) + 0j` and `isin = -1j * self.np.sin(theta / 2.0)`. When `theta` is a torch tensor with `requires_grad=True`, you can't compute cos/sin using numpy.
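An illustrative sketch of the failure mode and the fix, in plain `numpy`/`torch` outside qibo (the variable names mirror the snippet above):

```python
import numpy as np
import torch

theta = torch.tensor(0.7, requires_grad=True)

# numpy must convert the tensor to an ndarray, which PyTorch refuses
# for tensors tracked by autograd:
try:
    np.cos(theta / 2.0)
except (RuntimeError, TypeError) as exc:
    print(exc)  # complains that numpy() can't be called on a grad-tracking tensor

# Using torch ops instead keeps the computation in the autograd graph:
cos = torch.cos(theta / 2.0) + 0j
isin = -1j * torch.sin(theta / 2.0)
(cos.real - isin.imag).backward()  # gradient of cos(t/2) + sin(t/2)
print(theta.grad)
```

This is why `self.np` must resolve to `torch` (or another differentiable engine) inside the pytorch backend whenever the parameter carries gradients.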
But then why isn't …
I can try to fix things this way. Renato and I just wanted to have the matrices defined with numpy and then cast to the correct backend.
This solution should be fine.
Complete pytorch backend, close issue #1266
Checklist: