
Error using Swap Voice and Clone Voice on Apple Silicon M2 #77

Open
winterhuette opened this issue Sep 17, 2023 · 2 comments

@winterhuette

I receive the following error when starting webui.py with -enablemps on my Mac mini (Apple M2). This only happens with Swap Voice or Clone Voice; TTS works fine.

File "/Users/jan/anaconda3/lib/python3.11/site-packages/torch/serialization.py", line 165, in validate_cuda_device
raise RuntimeError('Attempting to deserialize object on a CUDA '
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

It does not seem to load the Hubert_model onto MPS but instead tries to use CUDA (which I do not have).
Do I need to adjust the code somewhere? I thought setting the -enablemps switch would be enough.

Thank you very much for your support.

@winterhuette
Author

I solved this by replacing the line "model.load_state_dict(torch.load(path))" with "model.load_state_dict(torch.load(path, map_location='mps'))" in "customtokenizer.py" in the folder "/bark-gui/bark/hubert/". Hope this helps others.
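
For reference, a minimal sketch of that change wrapped in a small helper; the function name load_state_dict_portable and the CPU fallback are my own additions, not part of bark-gui:

```python
import torch

def load_state_dict_portable(model: torch.nn.Module, path: str) -> torch.nn.Module:
    # Load a checkpoint that was saved on a CUDA machine onto Apple Silicon:
    # map tensors to MPS when available, otherwise fall back to CPU.
    device = torch.device('mps' if torch.backends.mps.is_available() else 'cpu')
    model.load_state_dict(torch.load(path, map_location=device))
    return model.to(device)
```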

@denisroldan

This worked for me on an M1 Pro 😉
