
[mps] issue with Apple silicon compatibility #170

Open
magic-akari opened this issue Apr 18, 2023 · 1 comment
Labels
bug? The issue author thinks this is a bug

Comments

@magic-akari
Contributor

magic-akari commented Apr 18, 2023

OS version

Darwin arm64

GPU

mps

Python version

Python 3.8.16

PyTorch version

2.0.0

Branch of sovits

4.0 (Default)

Dataset source (Used to judge the dataset quality)

N/A

Where the problem occurs or what command you executed

inference

Situation description

Tips (a sketch of this setup follows the list):

  • use PYTORCH_ENABLE_MPS_FALLBACK=1
  • use -d mps
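
Putting the two tips together, assuming a standard PyTorch install (the fallback variable must be set before torch is imported; -d mps is the inference script's device flag):

import os

# Must be set before torch is imported: ops missing on MPS then fall back to CPU.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

import torch

# What -d mps selects inside the inference script.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")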

Issue:

Related code (excerpted; elided lines are marked with ...):

is_half = rad_values.dtype is not torch.float32
tmp_over_one = torch.cumsum(rad_values.double(), 1)  # % 1  ##### %1 means the following cumsum can no longer be optimized
if is_half:
    ...  # dtype cast of tmp_over_one, elided here
...  # wrap detection producing tmp_over_one_idx, elided here
cumsum_shift[:, 1:, :] = tmp_over_one_idx * -1.0
rad_values = rad_values.double()
cumsum_shift = cumsum_shift.double()
sine_waves = torch.sin(torch.cumsum(rad_values + cumsum_shift, dim=1) * 2 * np.pi)

There are several casts to double in the source code. Are they required?

Some double-precision operations are not implemented on the mps device.

I think float is enough, but I am not sure.
I have modified and tested it locally, and it works well; a sketch of the change follows.
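
A minimal sketch of the float-only variant I have in mind (hypothetical: f0_to_sine_float is a made-up name, and the tensor layout (batch, frames, harmonics) and surrounding SineGen code are assumptions):

import numpy as np
import torch

def f0_to_sine_float(rad_values: torch.Tensor) -> torch.Tensor:
    # Wrap detection entirely in float32: no .double() casts, so every
    # op used here is implemented on the mps backend.
    tmp_over_one = torch.cumsum(rad_values, dim=1) % 1
    tmp_over_one_idx = (tmp_over_one[:, 1:, :] - tmp_over_one[:, :-1, :]) < 0
    cumsum_shift = torch.zeros_like(rad_values)
    cumsum_shift[:, 1:, :] = tmp_over_one_idx * -1.0
    # cumsum_shift subtracts 1 at each detected wrap, so the final cumsum
    # stays near [0, 1) and float32 precision is not stressed by sin().
    return torch.sin(torch.cumsum(rad_values + cumsum_shift, dim=1) * 2 * np.pi)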

Is there a significant loss of precision in moving the torch.cumsum operation from double to float?
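
One way to quantify the drift empirically (a hypothetical check with made-up numbers: a constant 200 Hz f0 at 44.1 kHz for 10 s of audio):

import torch

# Per-sample phase increments.
rad = torch.full((441000,), 200.0 / 44100.0)

# Maximum absolute phase drift of the float32 cumsum vs. float64.
drift = (torch.cumsum(rad.float(), 0).double() - torch.cumsum(rad.double(), 0)).abs().max()
print(drift)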

CC: @ylzz1997

Log

N/A

Supplementary description

No response

@magic-akari magic-akari added the bug? The issue author thinks this is a bug label Apr 18, 2023
@ylzz1997
Contributor

In theory, it doesn't matter. F0 normalized to 0-1 only needs to be accurate to about 1e-7, which float just covers.

But the cast from float to double before the cumsum was written by the author of NSF-HiFiGAN. To find out what effect it actually has, an issue could be raised under the NSF-HiFiGAN project.
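
For reference, float32's machine epsilon sits right at that 1e-7 scale:

import torch

print(torch.finfo(torch.float32).eps)  # 1.1920929e-07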
