Thanks for posting an example @educationunion! This issue was previously reported in #8, but I didn't have a good minimal example to try. I had a quick look, and I think this is caused by the condition check around here, which fails to work as expected when the two sequences being compared have different lengths. This is likely a result of adding support for such sequences in this commit.
I think the condition has to take the length of each sequence into account before skipping the calculation.
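To illustrate the diagnosis, here is a minimal pure-Python sketch of the soft-DTW recursion (not the library's actual CUDA kernel) showing why a naive Sakoe-Chiba check of the form `abs(i - j) > bandwidth` yields inf for unequal lengths: the final cell `(len_x, len_y)` itself falls outside the band, so it is never computed. The `rescale_band` flag below is a hypothetical fix, not the library's API, that scales the column index so the band follows the diagonal of the non-square cost matrix.

```python
import math

def soft_min(a, b, c, gamma):
    # Smoothed minimum used by soft-DTW: -gamma * logsumexp(-x / gamma).
    vals = [-v / gamma for v in (a, b, c)]
    m = max(vals)
    if m == -math.inf:
        return math.inf  # all predecessors are inf
    s = sum(math.exp(v - m) for v in vals)
    return -gamma * (m + math.log(s))

def soft_dtw(D, gamma=0.1, bandwidth=None, rescale_band=False):
    # D is an N x M cost matrix; R holds the accumulated soft-DTW values.
    N, M = len(D), len(D[0])
    R = [[math.inf] * (M + 1) for _ in range(N + 1)]
    R[0][0] = 0.0
    for i in range(1, N + 1):
        for j in range(1, M + 1):
            if bandwidth is not None:
                # Hypothetical fix: rescale j by N/M so the band tracks the
                # diagonal of a non-square matrix instead of the i == j line.
                jj = j * N / M if rescale_band else j
                if abs(i - jj) > bandwidth:
                    continue  # skipped cell stays inf
            R[i][j] = D[i - 1][j - 1] + soft_min(
                R[i - 1][j], R[i][j - 1], R[i - 1][j - 1], gamma)
    return R[N][M]

# Toy cost matrix for sequences of length 15 and 12 (all pairwise costs 1),
# mirroring the len_x = 15, len_y = 12 example from this issue.
D = [[1.0] * 12 for _ in range(15)]
print(soft_dtw(D, bandwidth=2))                     # inf: |15 - 12| = 3 > 2
print(soft_dtw(D, bandwidth=2, rescale_band=True))  # finite
```

With the naive check, the end cell `(15, 12)` has `|i - j| = 3 > bandwidth = 2`, so it is skipped and the loss stays inf no matter what the costs are; rescaling the band keeps the end cell (and a connected path to it) inside the band.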
Hello @Maghoumi ,
I have tried the example code you provided, as follows, and I noticed that when choosing bandwidth = 1 or 2, the loss values are all inf. Could you please help me solve this issue?
Sample codes:
import torch
from soft_dtw_cuda import SoftDTW
batch_size, len_x, len_y, dims = 8, 15, 12, 5
x = torch.rand((batch_size, len_x, dims), requires_grad=True)
y = torch.rand((batch_size, len_y, dims))
sdtw = SoftDTW(use_cuda=False, gamma=0.1, bandwidth=2)
loss = sdtw(x, y)
loss
--- OUTPUT:
tensor([inf, inf, inf, inf, inf, inf, inf, inf], grad_fn=<_SoftDTWBackward>)
sdtw = SoftDTW(use_cuda=False, gamma=0.1, bandwidth=1)
loss = sdtw(x, y)
loss
--- OUTPUT:
tensor([inf, inf, inf, inf, inf, inf, inf, inf], grad_fn=<_SoftDTWBackward>)
Thank you very much.
Regards,