
Batchnorm and masking #46

Open
danpovey opened this issue Nov 9, 2020 · 4 comments
danpovey commented Nov 9, 2020

It looks like the batch norm doesn't take the masking into account:

x = F.relu(self.bn(self.tdnn(x)))

Surely this isn't right? However, I don't know how to take it into account.
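
For reference, one way to take the mask into account is to compute the batch statistics over valid frames only. A minimal sketch, assuming `x` has shape `(B, T, C)` and `mask` is `(B, T)` with 1 for real frames; the layout, the `mask` argument, and the class name are assumptions for illustration, not this repo's interfaces:

```python
import torch
import torch.nn as nn

class MaskedBatchNorm1d(nn.Module):
    """Batch norm that computes statistics over unmasked frames only."""

    def __init__(self, num_features, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x, mask):
        # x: (B, T, C); mask: (B, T), 1 for real frames, 0 for padding
        mask = mask.unsqueeze(-1).to(x.dtype)               # (B, T, 1)
        if self.training:
            n = mask.sum()                                  # count of valid frames
            mean = (x * mask).sum(dim=(0, 1)) / n           # per-channel mean
            var = ((x - mean) ** 2 * mask).sum(dim=(0, 1)) / n
            with torch.no_grad():                           # update running stats
                self.running_mean.lerp_(mean, self.momentum)
                self.running_var.lerp_(var, self.momentum)
        else:
            mean, var = self.running_mean, self.running_var
        x = (x - mean) / torch.sqrt(var + self.eps)
        return x * self.weight + self.bias
```

Padded positions still come out as garbage after normalization, but they no longer contaminate the statistics; downstream code would mask them out as before.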

freewym (Owner) commented Nov 9, 2020

I think batchnorm is per-dimension, so the masked part will not affect the unmasked part?
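
A quick way to check whether padding leaks into the statistics: nn.BatchNorm1d is per-channel, but each channel's mean/variance are pooled over both the batch and time dimensions, so zero padding does shift the statistics applied to the real frames. The shapes below are illustrative:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm1d(3).train()

x = torch.randn(2, 3, 4)                                # (batch, channels, time)
padded = torch.cat([x, torch.zeros(2, 3, 4)], dim=2)    # zero-pad the time axis

out_real = bn(x)
out_padded = bn(padded)[..., :4]                        # same real frames
print(torch.allclose(out_real, out_padded))             # False: padding changed the stats
```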

danpovey (Author) commented Nov 9, 2020 via email

freewym (Owner) commented Nov 9, 2020

Oh OK. How does Kaldi deal with it? Does it guarantee the same length within a batch, so there's no padding?
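
For reference, one generic way to get the equal-length property (not necessarily what Kaldi does) is to bucket utterances by exact length, so every minibatch is padding-free. `utt_lengths` and `batch_size` here are hypothetical names:

```python
from collections import defaultdict

def make_equal_length_batches(utt_lengths, batch_size):
    """Group utterance ids by exact frame length so no batch needs padding.

    utt_lengths: dict of utterance id -> number of frames (hypothetical input).
    """
    buckets = defaultdict(list)
    for utt_id, length in utt_lengths.items():
        buckets[length].append(utt_id)
    batches = []
    for utts in buckets.values():
        for i in range(0, len(utts), batch_size):
            batches.append(utts[i:i + batch_size])
    return batches
```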

danpovey (Author) commented Nov 9, 2020 via email
