Move autocast op list to autocast_mode.h to make sure other backends can reuse it. #125114
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/125114
Note: Links to docs will display an error until the docs builds have been completed. ✅ No failures as of commit 31ac082 with merge base 68a1f78. This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from 38abf49 to 96ffe20 (compare)
In my opinion, it would be better to leave it as is, because each backend supports autocast differently.
Yes, if a backend has a different autocast support list, it can simply not use these macros. Backends that support the same op list can reuse them.
Moving the list to the .h file improves reusability and makes it less backend-specific.
That sounds fair since it won't change compile time and we don't expect this structure to move in the future.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 1 mandatory check(s) failed. The first few are: Dig deeper by viewing the failures on hud.
@pytorchbot rebase
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here.
Successfully rebased |
Force-pushed from 96ffe20 to 31ac082 (compare)
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
This PR refactors the op list added in #124051 so that other backends can reuse it.
cc @mcarilli @ptrblck @leslie-fang-intel @jgong5 @guangyey