Add Ollama #646
base: main
Conversation
I guess it would also be fine to have it as a custom model provider in a separate package, if we don't want to maintain it here in this repo.
Jeremy, thank you very much for working on this PR and for keeping up with the open source LLM ecosystem. I lack the time to engage there as much as I would like. Let me know when this PR is ready, and I will approve & merge it once I run it locally. 👍
Looking forward to this integration.
Thanks @dlqqq and @lalanikarim! Yeah, I'll try to finish the PR soon. It was originally opened early to see if there was interest in having built-in support for Ollama.
Currently: jupyter-ai/packages/jupyter-ai/jupyter_ai/extension.py, lines 61 to 75 in 642ac53
Although I'm not sure whether that would be enough to use models that have not been added to the list of models.
🙇 @jtpio excited for this
@lalanikarim - I am also replicating a similar setup. Just curious, are you able to use the /learn and /generate commands in the chat?
I haven't had any luck with
I haven't tried
@jtpio I am wondering if it would make sense to provide the model name in a free-form text field for Ollama models, and to require the %%ai magic to include the model name for Ollama models, rather than limiting models to a predefined list.
Thoughts?
Yeah, I guess given the number of available models it would be very difficult to pick just a few here. So yes, having a way to let users configure the list of models sounds like a good idea.
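For illustration, such user configuration could be sketched in a Jupyter Server config file. This is only a hypothetical shape: the traitlet name `allowed_models` and the `ollama:<name>` model IDs are assumptions, not something this PR has settled on.

```python
# jupyter_server_config.py -- hypothetical sketch; the exact traitlet name
# and model ID format depend on what AiExtension ends up exposing.
c.AiExtension.allowed_models = [
    "ollama:llama3",    # assumed Ollama model IDs
    "ollama:mistral",
]
```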
Hi, thanks for the hack. It's working. But curious: how did you set the OpenAI API key? It works inside the chat box (Jupyternaut), but not inside the Jupyter notebook, where it asks for OPENAI_API_KEY. Can you please help?
Any update on this PR for adding Ollama to Jupyter AI?
I'll try to finish this PR soon, and provide a way to configure the list of models (+ docs). |
@jtpio: Thank you for this Ollama integration, Jeremy! Also looking forward to configurable models. Let me know where I can buy you a coffee.
@siliconberry You can put any value for OPENAI_API_KEY. It is needed for this hack to work but will be ignored by Ollama.
@lalanikarim in answer to @siliconberry's question: have you gotten both the chat (Jupyternaut) and notebook cells (i.e. the %%ai magic) working with Ollama? I'm struggling to get the notebook working. When I set the OPENAI_API_KEY environment variable to a dummy key (sk-abcd1234), the notebook gives an error indicating that ChatGPT doesn't accept the key, while the Jupyternaut chat works fine. It seems like either jupyter_ai's notebook magic is not using the config.json like Jupyternaut does, or Ollama does not support the full ChatGPT API? Maybe I'm missing something.
Never mind, @lalanikarim @siliconberry, the %%ai magic does work. Note: the magic doesn't use config.json; it uses environment variables for the key and URL.
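The workaround discussed above can be sketched as follows, assuming Ollama is serving its OpenAI-compatible endpoint at the default local address; the dummy key value is arbitrary, since Ollama ignores it.

```python
import os

# The key is required by the OpenAI client but ignored by Ollama.
os.environ["OPENAI_API_KEY"] = "sk-dummy"
# Point the client at Ollama's OpenAI-compatible endpoint
# (Ollama's default port is 11434).
os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"
```

With these variables set before starting the notebook, the %%ai magic can then be pointed at an OpenAI-style chat model name while the requests actually go to the local Ollama server.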
Just joining the chorus of folks who would be excited to see this, especially given the rate at which new open-weights models are regularly appearing for Ollama (llama3, phi3, wizardlm2). Ollama's use of the local GPU also seems a lot smoother than GPT4All's.
I'm so excited for this! 😍😍😍
Fixes #482
Ollama seems to be getting popular for running models locally, and it looks like it would be good to have in Jupyter AI by default.

This PR adds:
- OllamaProvider
- OllamaEmbeddingsProvider
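As a rough illustration of what these providers talk to: Ollama exposes a local REST API whose generate requests carry the model name and prompt. A minimal sketch of the request body, where the helper function name is hypothetical (not part of this PR) and the field names follow Ollama's `/api/generate` endpoint:

```python
import json

# Hypothetical helper: build the JSON body for a request to Ollama's
# /api/generate endpoint (model name, prompt, and streaming flag).
def build_ollama_request(model: str, prompt: str, stream: bool = False) -> str:
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# A provider would POST this body to http://localhost:11434/api/generate.
body = build_ollama_request("llama3", "Why is the sky blue?")
print(body)
```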