We're definitely interested in adding more models!
As I understand it, the 2B and 7B architectures are roughly the same (just different parameter sizes). If you'd be interested in adding the model yourself, we'd gladly stamp it :) It should be fairly simple, just adding:
A new gemma_7b function in the gemma/_model_builders.py file with the appropriate sizes
A screenshot or attached W&B log showing that the model learns from a simple alpaca fine-tuning
An update to our models docs to show we now support it!
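To illustrate the first step, here's a hedged sketch of what the new builder might look like. The hyperparameter values are the published Gemma 7B configuration, but the actual signature of the shared component builder in `gemma/_model_builders.py` may differ, so the `GemmaConfig` dataclass below is a stand-in for whatever the real builder returns:

```python
from dataclasses import dataclass


@dataclass
class GemmaConfig:
    # Stand-in for the arguments the real shared Gemma component
    # builder would receive; field names here are illustrative.
    vocab_size: int
    num_layers: int
    num_heads: int
    num_kv_heads: int
    embed_dim: int
    intermediate_dim: int
    head_dim: int = 256
    max_seq_len: int = 8192


def gemma_7b() -> GemmaConfig:
    """Builder for the Gemma 7B architecture.

    Same layout as the 2B builder, just with larger sizes.
    """
    return GemmaConfig(
        vocab_size=256_000,
        num_layers=28,
        num_heads=16,
        num_kv_heads=16,  # 7B uses full multi-head attention (2B uses 1 KV head)
        embed_dim=3072,
        intermediate_dim=24_576,
    )
```

The real change would likely just mirror the existing `gemma_2b` function and swap in these sizes.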
LMK if you have any questions or if this is something you'd like to take on.
Hi!
Are you considering adding support for Gemma 7B? It seems that it would be a great addition to the set of available models.