This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

Tried multiple different models but get "The model weights are not tied..." error every time #266

Open
jontstaz opened this issue Sep 30, 2023 · 0 comments
Labels
question Further information is requested

Comments


Hi,

I'm running Basaran via Docker and I have now tried several different models, but every time, after everything has downloaded and loaded, I hit this error: The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function

Am I missing something? I've tried multiple GPTQ models from TheBloke and even the official Llama2-13b model but this error is thrown every single time regardless of the model and it prevents me from using Basaran at all.

Any help would be appreciated. Thanks in advance.
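For context, this message comes from the Hugging Face accelerate/transformers loading path, and it suggests calling `tie_weights()` on the model before any device-map inference is done. Below is a minimal sketch of what that call does, using a tiny GPT-2 config purely for illustration so nothing is downloaded; Basaran's actual loading code may differ, and this is not a confirmed fix from this thread:

```python
# Illustrative only: what the error message's suggested tie_weights() call does.
# The tiny GPT2Config here is a hypothetical stand-in, not a model from the issue.
from transformers import GPT2Config, GPT2LMHeadModel

# Small config so the example runs without downloading any weights.
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=128)
model = GPT2LMHeadModel(config)

# Tie the input embedding matrix to the output (LM head) projection.
model.tie_weights()

# The embedding and LM head now share a single parameter tensor.
print(model.lm_head.weight is model.transformer.wte.weight)
```

In principle, a loader would call `tie_weights()` like this after instantiating the model and before handing it to accelerate's device-map logic.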

@fardeon fardeon added the question Further information is requested label Oct 8, 2023