
Can we have a model with different embedding size if we finetune on our data? #414

Closed
benam2 opened this issue May 8, 2024 · 3 comments

benam2 commented May 8, 2024

Hi, and sorry if this question comes across as naive. I'm aiming for a significantly smaller embedding size for my project, and I'm wondering if we could tweak the architecture to achieve a dimension of 100 or even less for my images, as opposed to the 384, 768, ... offered by the current models.

Is there a quick way to do this just to measure the impact on my results? Thanks in advance!

@benam2 benam2 changed the title Question: can we have a model with different embedding size if we finetune on our data? Can we have a model with different embedding size if we finetune on our data? May 8, 2024
Author

benam2 commented May 9, 2024

Any ideas on this, please, @qasfb?

Contributor

qasfb commented May 13, 2024

You can plug the model into a linear layer that outputs 100 channels and learn that linear layer on your task (if your goal is to store a set of embeddings).
Otherwise, if you want a smaller transformer model with embedding dimension 100, there is no quick way.
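A minimal sketch of the linear-head idea in PyTorch. The `FrozenBackbone` class below is a hypothetical stand-in for the real pretrained model (which would normally be loaded, e.g., via `torch.hub`); only the 384-in / 100-out linear layer is trained, while the backbone stays frozen.

```python
import torch
import torch.nn as nn

# Hypothetical stub standing in for a frozen pretrained backbone that
# produces 384-dim embeddings (e.g. a ViT-S variant).
class FrozenBackbone(nn.Module):
    def __init__(self, dim=384):
        super().__init__()
        self.proj = nn.Linear(3 * 224 * 224, dim)  # placeholder, not the real model

    def forward(self, x):
        return self.proj(x.flatten(1))

backbone = FrozenBackbone().eval()
for p in backbone.parameters():
    p.requires_grad = False  # backbone is never updated

# The only trainable part: a linear head mapping 384 -> 100 channels.
head = nn.Linear(384, 100)
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

images = torch.randn(8, 3, 224, 224)   # dummy batch
targets = torch.randn(8, 100)          # placeholder task targets

with torch.no_grad():
    feats = backbone(images)           # (8, 384) frozen embeddings
emb = head(feats)                      # (8, 100) compact embeddings
loss = nn.functional.mse_loss(emb, targets)  # swap in your task's loss
loss.backward()
opt.step()
```

The loss here is a placeholder; for retrieval you would typically train the head with a metric-learning objective (e.g. a contrastive or triplet loss) and then store only the 100-dim outputs.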

@qasfb qasfb closed this as completed May 13, 2024
Author

benam2 commented May 20, 2024

@qasfb Sorry for opening this issue again. But based on your experience, do you think adding a layer with 100 channels and learning it on my task can give similar performance on retrieval tasks? Just want to pick your brain on that.

Also, what would the non-quick approach be for a smaller transformer model with embedding size 100? Even a rough idea would help a lot. Thank you.
