

Doesn't this loss function have the issue that the beginning time steps will get a much larger gradient than the final ones? #25

Open
RuABraun opened this issue Mar 6, 2021 · 1 comment

Comments


RuABraun commented Mar 6, 2021

I want to confirm that the issue I'm experiencing is a fundamental property of the loss and not of my implementation (which is a slight modification of this).

It seems to me that because the final loss is a sum over different alignment paths, changing the (0, 0) entry of the cost matrix causes a much larger change in the loss than changing a later entry, since (0, 0) influences every other entry in the accumulated cost matrix. Some simple test cases seem to confirm this. Can someone else confirm?
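This is easy to check numerically. Below is a minimal numpy sketch of the standard soft-DTW recursion (my own illustration, not this repo's code), which compares the numeric gradient of the loss with respect to the (0, 0) cost entry against a mid-matrix entry. Since every alignment path passes through (0, 0), its gradient should be exactly 1, while interior entries get the (smaller) expected alignment weight:

```python
import numpy as np

def softmin(a, b, c, gamma):
    # smooth minimum: -gamma * log(exp(-a/gamma) + exp(-b/gamma) + exp(-c/gamma))
    x = -np.array([a, b, c]) / gamma
    m = x.max()
    return -gamma * (m + np.log(np.exp(x - m).sum()))

def soft_dtw(D, gamma=1.0):
    # forward DP over the pairwise cost matrix D
    n, m = D.shape
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            R[i, j] = D[i - 1, j - 1] + softmin(R[i - 1, j], R[i, j - 1],
                                                R[i - 1, j - 1], gamma)
    return R[n, m]

def num_grad(D, i, j, gamma=1.0, eps=1e-5):
    # central-difference gradient of the loss w.r.t. a single cost entry
    Dp, Dm = D.copy(), D.copy()
    Dp[i, j] += eps
    Dm[i, j] -= eps
    return (soft_dtw(Dp, gamma) - soft_dtw(Dm, gamma)) / (2 * eps)

rng = np.random.default_rng(0)
D = rng.random((6, 6))
g_corner = num_grad(D, 0, 0)  # every path passes through (0, 0)
g_mid = num_grad(D, 3, 3)     # only some paths pass through (3, 3)
print(g_corner, g_mid)
```

The corner gradient comes out ≈ 1.0 while the interior one is strictly smaller, consistent with the gradient of soft-DTW with respect to D[i, j] being the expected alignment weight of cell (i, j).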

@v-nhandt21 commented

> I want to confirm that the issues I'm experiencing are a fundamental issue with the loss and not my implementation (which is a slight modification of this).
>
> It seems to me that because the final loss is a sum of different paths, changing the (0, 0) entry in the cost matrix will cause a much larger change in the loss than changing a later entry, as changing (0, 0) influences every other entry in the cost matrix. Some simple test cases seem to confirm this. Can someone else confirm?

Have you found a more robust and correct soft-DTW implementation? I have tried this one, but the GPU runs out of memory and I can only set the batch size to 1.
