Ollama integration not working and Groq not responding #132

Open · Himanshu-369 opened this issue May 17, 2024 · 7 comments


Himanshu-369 commented May 17, 2024

Tried the previous solution:

pip uninstall openui
pip install .

The issue is still the same, and now even the Ollama integration is not working. Also, GROQ_BASE_URL and GROQ_API_KEY make no sense to me, because setting them changes nothing.
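
A minimal sketch for checking whether those Groq variables are actually visible to the process and whether the endpoint answers (the default base URL is an assumption based on Groq's OpenAI-compatible API; OpenUI may expect something different):

import os
import requests

# The two variables mentioned above; the fallback base URL is an assumption.
base_url = os.environ.get("GROQ_BASE_URL", "https://api.groq.com/openai/v1")
api_key = os.environ.get("GROQ_API_KEY")
if not api_key:
    raise SystemExit("GROQ_API_KEY is not visible to this process")

# List the models the key can reach; a 2xx response means the
# credentials and URL are fine and the problem is elsewhere.
resp = requests.get(f"{base_url}/models",
                    headers={"Authorization": f"Bearer {api_key}"},
                    timeout=10)
resp.raise_for_status()
print([m["id"] for m in resp.json().get("data", [])])

If this prints model IDs, the variables and the endpoint are fine and the issue is on the OpenUI side.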

Himanshu-369 (Author) commented

Ollama is also not working. I set the API key with:

set OPENAI_API_KEY = xxx

Then started Ollama; it's up and running on port 11434, but the model name is not showing in OpenUI.
Attached screenshots: (three images)
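
Independently of OpenUI, what Ollama itself reports can be checked against its list-models route; a minimal sketch, assuming the default localhost:11434 bind:

import requests

# /api/tags is Ollama's list-models endpoint (the same data `ollama list` shows).
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print(models or "Ollama is up but reports no models")

An empty list here would mean OpenUI has nothing to display, regardless of its own integration.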


SFARPak commented May 19, 2024

You need to pull a model to make Ollama appear:

ollama pull llama2

After pulling, rerun the server.
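
A quick way to confirm the pull worked from Python, assuming the official ollama client package (pip install ollama); the exact response shape has varied across client versions:

import ollama  # official Python client: pip install ollama

# ollama.list() mirrors `ollama list` and reports every locally pulled model.
print(ollama.list())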

Himanshu-369 (Author) commented

> You need to pull a model to make Ollama appear:
>
> ollama pull llama2
>
> After pulling, rerun the server.

Bro, are you kidding? I have models installed; someone would be an idiot to use Ollama without any models downloaded.


SFARPak commented May 20, 2024

This is not Discord or Facebook; nobody is kidding on GitHub.
Which OS are you working on?
Please send screenshots of both Ollama running and the models dropdown in OpenUI.

vanpelt (Contributor) commented May 20, 2024

@Himanshu-369 it looks like you don't have the most recent changes. The UI is making a request to /v1/ollama/tags, which has been replaced with /v1/models on the main branch. I just pushed some new changes to main and bumped the version, so try pulling and re-installing again. If you check the Network tab in your Chrome inspector window, you'll find /v1/models, and the response should contain a list of the Ollama models that are running.

(Screenshot attached: 2024-05-20 at 4:40:10 PM)
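
To verify the new build is actually serving the route, the endpoint can be hit directly; a minimal sketch, assuming OpenUI is running on its default port 7878:

import requests

# OpenUI's default local address; adjust the port if you changed it.
resp = requests.get("http://localhost:7878/v1/models", timeout=5)
resp.raise_for_status()
print(resp.json())  # should include the Ollama models the backend can see

A 404 here would suggest the old build is still installed.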

P.S. I agree with @SFARPak. The tone in the previous message rubbed me the wrong way. We're all trying to help here, let's tone it down 🙏

Himanshu-369 (Author) commented

I want to apologize for the tone in my previous message. It was not my intention to come across as harsh or disrespectful to anyone, @SFARPak @vanpelt. I appreciate all the efforts everyone is putting in to help and collaborate.


SFARPak commented May 28, 2024

> I want to apologize for the tone in my previous message. It was not my intention to come across as harsh or disrespectful to anyone, @SFARPak @vanpelt. I appreciate all the efforts everyone is putting in to help and collaborate.

No worries. Did you get the issue solved?
I was asking about your OS because on Windows environment variables do not work correctly, so you will need to install another package called python-decouple:

python -m pip install python-decouple
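
Once installed, values can be read from a .env file instead of relying on set; a minimal sketch using decouple's config (the variable names are the ones from this thread; whether OpenUI itself reads them this way is an assumption):

from decouple import config  # provided by python-decouple

# Reads from a .env file (or settings.ini) next to the script, falling
# back to real environment variables if the file does not define them.
OPENAI_API_KEY = config("OPENAI_API_KEY")
GROQ_API_KEY = config("GROQ_API_KEY", default="")

with a .env file alongside it containing lines such as OPENAI_API_KEY=xxx.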
