-
So that is already enabled via Oobabooga. Ooba can load many different LLMs, and you can access them via API. We've got a couple of people who have gotten it working with Vicuna already. One issue with open-source LLMs is that the results are usually not as good. Expect failures on tasks; the weaker the model, the more failures you will get.
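For anyone wanting to try this, here's a minimal sketch of calling a locally running Oobabooga (text-generation-webui) instance over its API. This assumes the legacy `/api/v1/generate` endpoint on the default port 5000 (enabled with the `--api` launch flag); the host, port, and sampling parameters here are placeholders, so adjust them to your setup.

```python
import json
import urllib.request

# Assumed local endpoint -- Oobabooga's legacy API defaults to port 5000.
# Change this to match the flags you launched the webui with.
API_URL = "http://localhost:5000/api/v1/generate"

def build_payload(prompt, max_new_tokens=200, temperature=0.7):
    """Assemble the JSON body the generate endpoint expects."""
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
    }

def generate(prompt):
    """POST the prompt to the local server and return the generated text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The legacy API wraps completions in a "results" list.
    return body["results"][0]["text"]

if __name__ == "__main__":
    print(generate("Explain in one sentence why local LLMs can fail on complex tasks."))
```

Since whichever model Ooba has loaded (Vicuna or otherwise) sits behind the same endpoint, swapping models needs no code changes on this side.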
-
I will be working on this later today. I've been on vacation, so I've been out. I'll clone the updated repo and get it running locally again. When that happens, I'll submit a request for you to review.
-
An option would be to use LiteLLM, which allows the use of 100+ models across providers. Would love to test out using Ollama with this.
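To make the suggestion concrete, here's a sketch of what routing through LiteLLM to a local Ollama model could look like. This assumes `pip install litellm`, an Ollama server on its default port 11434, and a pulled model; the model name `ollama/llama3` is just an example.

```python
def make_messages(user_prompt, system_prompt="You are a helpful assistant."):
    """Build the OpenAI-style message list that LiteLLM expects."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    # LiteLLM exposes one completion() interface across providers;
    # the "ollama/" prefix in the model name selects the Ollama backend.
    from litellm import completion

    resp = completion(
        model="ollama/llama3",              # example model, swap for any pulled model
        messages=make_messages("Say hello in one sentence."),
        api_base="http://localhost:11434",  # default Ollama endpoint
    )
    print(resp.choices[0].message.content)
```

Because the message format is OpenAI-style, pointing the same call at an OpenAI or Anthropic model would only mean changing the `model` string.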
-
Heya, just saying it would be SUPER dope if we were allowed to use LLMs other than just OpenAI's, like the open-source Alpaca and things like that :) I know a lot of other stuff is probably being worked on, but just sayin', that would be super cool :)