
Unable to use Nx indexed functions with Torchx :mps backend #1107

Closed
megaboy101 opened this issue Feb 19, 2023 · 3 comments
Labels
area:torchx Applies to Torchx

Comments

megaboy101 commented Feb 19, 2023

I've been experimenting with Nx this past week, and since I'm working on an M1 Mac, I've been swapping between the :cpu and :mps devices in Torchx to compare relative performance.

I noticed when trying to use some of the indexed functions like Nx.indexed_add() and Nx.indexed_put() with the :mps device, I receive the following error:

Nx.indexed_put(Nx.tensor([0, 0, 0]), Nx.tensor([[1], [2]]), Nx.tensor([2, 4]))
** (RuntimeError) Torchx: All inputs of where should have same/compatible number of dims in NIF.where/3
    (torchx 0.5.1) lib/torchx.ex:445: Torchx.unwrap!/1
    (torchx 0.5.1) lib/torchx.ex:448: Torchx.unwrap_tensor!/2
    (torchx 0.5.1) lib/torchx/backend.ex:1403: Torchx.Backend.select/4
    (torchx 0.5.1) lib/torchx/backend.ex:536: Torchx.Backend.as_torchx_linear_indices/2
    (torchx 0.5.1) lib/torchx/backend.ex:504: Torchx.Backend.indexed/5

Note: This works fine when using the :cpu device; only the :mps device seems to cause these issues.
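
For reference, a minimal way to reproduce this, as a sketch assuming the device is selected by making Torchx with device: :mps the process-wide default backend (the variable names here are illustrative):

# Assumed setup: make Torchx with the :mps device the default backend,
# then run the same indexed_put/3 call shown above.
Nx.default_backend({Torchx.Backend, device: :mps})

target  = Nx.tensor([0, 0, 0])
indices = Nx.tensor([[1], [2]])
updates = Nx.tensor([2, 4])

# On :mps this raises the "where should have same/compatible number of dims" error;
# with device: :cpu the same call returns a tensor equal to [0, 2, 4].
Nx.indexed_put(target, indices, updates)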

josevalim added the area:torchx label on Feb 19, 2023
josevalim (Collaborator) commented

Yes, the :mps backend is experimental in both LibTorch itself and in Nx, so it is expected that you will run into issues. A pull request is welcome!

megaboy101 (Author) commented

Ah gotcha, yeah, I'd definitely be down to take a crack at a PR. Appreciate the clarification!

josevalim (Collaborator) commented

Closing this, as it is an :mps issue and support has to be added in LibTorch itself (which was most likely already done!).

josevalim closed this as not planned on May 12, 2024