How can I use Ollama models in OpenUI? #87
Comments
No need for an API key. Just set
I don't have an OpenAI API key, but I do have my own Ollama instance. If I remove the OPENAI_API_KEY var and set the OLLAMA_HOST var to my Ollama URL, the container fails to start, complaining that openai_api_key is not set.
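If the container refuses to start without an OpenAI key even when you only want Ollama, a common workaround is to pass a placeholder key alongside the Ollama address. This is a sketch, not a maintainer-confirmed fix: the image name, port, and the exact env vars OpenUI reads are assumptions here.

```shell
# Hypothetical invocation — image name, port, and env vars are assumptions.
# A dummy OPENAI_API_KEY satisfies the startup check, while OLLAMA_HOST
# points at the machine actually running Ollama (default port 11434).
docker run --rm -p 7878:7878 \
  -e OPENAI_API_KEY=xxx \
  -e OLLAMA_HOST=http://192.168.1.103:11434 \
  ghcr.io/wandb/openui
```

If Ollama runs on the Docker host itself, `http://host.docker.internal:11434` may be needed instead of a LAN IP, since `localhost` inside the container is not the host.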
No — I don't use the OpenUI Docker image; I just run OpenUI locally.
I don't know if you have solved your problem already, but it seems similar to this issue. The solution worked for me.
My OpenUI runs in an Ubuntu 18 VM (VMware Workstation) at 192.168.1.169; my Ollama instance and its models are on the physical host at 192.168.1.103. How can I use the Ollama models from OpenUI inside the VM?
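For this split setup, the usual approach is to make Ollama on the physical host listen on all interfaces (by default it binds only to 127.0.0.1) and then point OpenUI in the VM at the host's LAN address. A minimal sketch, assuming default ports and that the VM's network can reach 192.168.1.103:

```shell
# On the physical host (192.168.1.103): bind Ollama to all interfaces.
# OLLAMA_HOST=0.0.0.0 is Ollama's documented override; 11434 is its default port.
OLLAMA_HOST=0.0.0.0 ollama serve

# In the Ubuntu VM (192.168.1.169): tell OpenUI where Ollama lives.
export OLLAMA_HOST=http://192.168.1.103:11434

# Quick connectivity check before starting OpenUI — should return a JSON
# list of installed models if the host is reachable:
curl http://192.168.1.103:11434/api/tags
```

If the `curl` check fails, look at the host firewall and the VM's network mode first (bridged or NAT with host access), since those block cross-host traffic more often than Ollama itself does.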