Requesting model orca-2-7b raises llm.UnknownModelError #337
Comments
Calling the model directly through llm's | and | works as expected.
When initializing a |
Yes, I only saw OpenAI GPT entries in that mapping.
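A minimal sketch of the failure mode described above, assuming the dispatch goes through a hard-coded name-to-model mapping that contains only OpenAI GPT entries (the mapping contents, `MODEL_MAP`, and `get_model` here are hypothetical stand-ins, not OntoGPT's or llm's actual code):

```python
class UnknownModelError(Exception):
    """Stand-in for llm.UnknownModelError."""


# Assumed mapping: only OpenAI GPT models are registered, so any
# other name -- e.g. "orca-2-7b" -- will fail the lookup.
MODEL_MAP = {
    "gpt-3.5-turbo": "openai",
    "gpt-4": "openai",
}


def get_model(name: str) -> str:
    """Resolve a model name, raising UnknownModelError if unmapped."""
    try:
        return MODEL_MAP[name]
    except KeyError:
        raise UnknownModelError(f"Unknown model: {name}") from None


try:
    get_model("orca-2-7b")
except UnknownModelError as e:
    print(e)
```

This is consistent with the behavior reported: the model works when called through llm directly (which has its own registry) but fails when the intermediate code consults a narrower mapping.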
It's the exact same |
It appears the error was introduced here 706ab63#diff-d63276a5c9a6f76f8aa13cf0de45fe92c7b8abe4a329fe2d0e171d771ddce172R11 |
Issue c/o Bart Kleijngeld:

Using the `llm` package to call `orca-2-7b` directly works as expected, but attempting an extraction with OntoGPT fails. The same version of `llm` is used in both cases.