
How to freeze params of an embedding table in EmbeddingBagCollection? #1990

Open
tiankongdeguiji opened this issue May 13, 2024 · 3 comments

Comments

@tiankongdeguiji

No description provided.

@tiankongdeguiji
Author

Hi @henrylhtsang @IvanKobzarev @joshuadeng @PaulZhang12, could you take a look at this problem?

@colin2328
Contributor

A few ways you could do this:
The simplest way would be to constrain (through parameter constraints) your EBC to be dense (non-fused) and not set up an optimizer for the tables you want to freeze.
You could also try setting the lr to 0 if you are interested in freezing EBCs mid-training.

Keep in mind that if you have only some tables you want to freeze (and some you want to always be trainable), you will likely want to create separate EBCs for each (otherwise the sharder might combine them).
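For reference, a minimal sketch of the first suggestion, assuming TorchRec's sharding-planner API and a hypothetical table name `frozen_table`: the table to freeze is constrained to the dense (non-fused) compute kernel via `ParameterConstraints`, and its weights are simply left out of the dense optimizer.

```python
# Sketch only: freeze one EBC table by forcing it onto the dense (non-fused)
# compute kernel and leaving its weights out of the optimizer.
# The table name "frozen_table" is hypothetical.
from torchrec.distributed.embedding_types import EmbeddingComputeKernel
from torchrec.distributed.planner import EmbeddingShardingPlanner, Topology
from torchrec.distributed.planner.types import ParameterConstraints

# 1. Constrain the table you want to freeze to the DENSE kernel, so its weights
#    stay ordinary nn.Parameters instead of being updated by a fused optimizer
#    inside the embedding lookup kernel.
constraints = {
    "frozen_table": ParameterConstraints(
        compute_kernels=[EmbeddingComputeKernel.DENSE.value],
    ),
}
planner = EmbeddingShardingPlanner(
    topology=Topology(world_size=1, compute_device="cuda"),
    constraints=constraints,
)
# ... produce a plan with this planner and shard the model as usual
# (e.g. via DistributedModelParallel) ...

# 2. Build the dense optimizer only over the parameters you want to train,
#    skipping anything that belongs to the frozen table. `model` here stands
#    for the sharded model.
def trainable_dense_params(model):
    for name, param in model.named_parameters():
        if "frozen_table" not in name and param.requires_grad:
            yield param

# dense_optimizer = torch.optim.Adagrad(trainable_dense_params(model), lr=0.01)
```

For the lr-to-0 variant (freezing a fused table mid-training), the per-table optimizer settings would need to be adjusted instead; the exact mechanism depends on the TorchRec version in use.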

@tiankongdeguiji
Author


Thanks!
