
Question about average_frames and reduction params #27

Open
wl-junlin opened this issue Sep 10, 2021 · 1 comment

Comments

@wl-junlin

I want a stable loss that is robust to labels_lengths during training.
What values should I pass to these two params?

Also, what is the approximate relationship between the loss and the actual WER?
For example, if I want a WER around 0.5, roughly what should the loss value be?

@1ytic
Owner

1ytic commented Sep 12, 2021

You shouldn't average over frames. If I remember correctly, it doesn't make sense theoretically: the loss is calculated for the entire utterance.

There is no direct link between the RNN-T loss value and WER. I think a good analogue is the relationship between negative log-likelihood and the accuracy of a classifier.
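For intuition, here is a minimal sketch of how reduction parameters like these usually combine per-utterance losses. This is an illustrative reconstruction, not the library's actual implementation; `reduce_losses` is a hypothetical helper and the parameter semantics are assumptions based on common practice:

```python
def reduce_losses(losses, frames_lengths, average_frames=False, reduction="mean"):
    """Combine per-utterance losses into a single scalar.

    losses         -- list of per-utterance loss values
    frames_lengths -- list of frame counts, one per utterance
    average_frames -- if True, divide each loss by its frame count first
    reduction      -- "none", "sum", or "mean" over the batch
    """
    if average_frames:
        # Per-frame normalization; discouraged above, since the RNN-T loss
        # is defined over the whole utterance.
        losses = [l / t for l, t in zip(losses, frames_lengths)]
    if reduction == "none":
        return losses
    total = sum(losses)
    if reduction == "sum":
        return total
    return total / len(losses)  # "mean" over the batch

# Following the advice above: average_frames=False, reduction="mean"
# keeps the loss as the per-utterance value averaged over the batch.
print(reduce_losses([10.0, 20.0], [100, 200]))  # → 15.0
```

With this convention the loss scale stays independent of utterance length within each example, which matches the point that the loss is defined over the entire utterance.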
