
Local-model not working, as it expects openai api key. #13

Closed
bcosculluela opened this issue Mar 11, 2024 · 3 comments

@bcosculluela

Hello!
Regarding this issue, I am currently using LM Studio. When using local-model, it does not work. As I can see in the code, in interpreter_lib.py, line 324, the variable custom_llm_provider is set to 'openai', so it expects the OpenAI API key. What should the value of this variable be when using open-source LLMs such as Mistral?
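
For reference, this is roughly the kind of call I mean (a minimal sketch assuming the project routes requests through LiteLLM, since custom_llm_provider and provider-prefixed model names are LiteLLM conventions; the actual code in interpreter_lib.py may differ):

```python
import litellm

# Sketch of the problematic call path (assumed, not the exact project code):
# hard-coding custom_llm_provider='openai' makes LiteLLM require an OpenAI API key,
# even though the request is meant for a local server such as LM Studio.
response = litellm.completion(
    model="mistral",                      # local model name
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint (default port)
    custom_llm_provider="openai",         # <-- this triggers the OPENAI_API_KEY requirement
)
```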

@haseeb-heaven
Owner

Okay, I will take a look at this issue today.

@haseeb-heaven haseeb-heaven self-assigned this Mar 12, 2024
@haseeb-heaven haseeb-heaven added the bug Something isn't working label Mar 12, 2024
@bcosculluela
Author

Thanks! In my case, the issue was solved by setting:
model = "ollama/llama2"
and removing the custom_llm_provider variable.
Just in case it helps! 😄
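
In case it is useful to anyone else, a minimal sketch of the working local setup (assuming LiteLLM and a local Ollama server on its default port; this is illustrative, not the exact interpreter_lib.py code):

```python
import litellm

# Minimal sketch of the fix: use the "ollama/" model prefix and drop
# custom_llm_provider so LiteLLM infers the provider from the model name
# and no OpenAI API key is required.
response = litellm.completion(
    model="ollama/llama2",              # provider inferred from the "ollama/" prefix
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://localhost:11434",  # default Ollama endpoint (assumed)
)
print(response.choices[0].message.content)
```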

@haseeb-heaven
Owner

Fixed this bug in this PR: #14
