When I tried RAG-related experiments and replaced llama3 with my locally fine-tuned model, I ran into `ollama._types.ResponseError: model 'llama3' not found, try pulling it first`, even though I had already changed the model name:
```python
def get_rag_assistant(
    llm_model: str = "llama3:install",
    embeddings_model: str = "nomic-embed-text",
    user_id: Optional[str] = None,
    run_id: Optional[str] = None,
    debug_mode: bool = True,
) -> Assistant:
```
I don't understand why this is happening. Even when I changed it back to the default llama3, it still raised the same error. I'd like to know how Ollama is called in the source code, and whether its default model location can be changed; I suspect the problem lies there.
Thank you for your reply. I found that `ollama pull llama3` solves the problem, and strangely I didn't even need to change the model name back from llama3:install (the name of my locally fine-tuned model). So how should I use the locally fine-tuned model, since setting the name doesn't seem to work?
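For what it's worth, Ollama can only serve models that appear in `ollama list`; a locally fine-tuned model must first be registered under a name (e.g. with `ollama create <name> -f Modelfile`) before that name can be passed as `llm_model`. A minimal sketch to check which names your local server actually recognizes, using Ollama's documented REST endpoint `/api/tags` (the default host `http://localhost:11434` is an assumption here):

```python
import json
import urllib.error
import urllib.request


def list_local_models(host: str = "http://localhost:11434"):
    """Return the model names the local Ollama server knows about,
    or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        # Each entry in "models" has a "name" field like "llama3:latest".
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None
```

If the name you set in `get_rag_assistant` does not appear in this list, Ollama raises the same "model not found, try pulling it first" error no matter what default you edit in the source.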