
Implement cross-device reductions using Jet #57

Open
mlxd opened this issue Jul 30, 2021 · 0 comments
Labels
enhancement ✨ New feature or request

Comments

@mlxd
Member

mlxd commented Jul 30, 2021

Feature description

Jet currently supports slicing a given tensor network so that its contraction can be distributed over multiple devices: each device contracts its part of the network, whether CPU cores (BLAS-backed contractions) or GPUs (cuTENSOR-backed contractions). However, no mechanism currently exists for reductions across different device types. The goal is to implement a reduction task that enables efficient CPU-GPU and GPU-GPU contractions.

  • Context: Enable reductions across mixed devices for tensor network contractions.

  • Factors: Performance should match or exceed that of a single device working alone.

Tasks:

  • Implement a reduction task for CPU-GPU and GPU-GPU contractions based on the current main branch.
  • Add tests to verify the reduction is performed correctly.
  • Compare runtimes against the CPU-only (default) backend.
@Mandrenkov Mandrenkov changed the title from "Implement cross-device reductions using Jet," to "Implement cross-device reductions using Jet" Aug 9, 2021
@trevor-vincent trevor-vincent added the enhancement ✨ New feature or request label Aug 31, 2021