
Requesting model orca-2-7b raises llm.UnknownModelError #337

Open

caufieldjh opened this issue Feb 26, 2024 · 5 comments

@caufieldjh (Member)

Issue c/o Bart Kleijngeld:

Using the llm package to call orca-2-7b directly works as expected:

$ llm -m orca-2-7b "Hi!"
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3.83G/3.83G [05:58<00:00, 10.7MiB/s]
Hello! How can I help you today?

But attempting an extraction with ontogpt fails:

$ ontogpt extract -t drug -i example.txt -m ORCA_2_7B
Traceback (most recent call last):
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/llm/__init__.py", line 148, in get_model
    return aliases[name]
KeyError: 'orca-2-7b'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/harry/ontogpt/.venv/bin/ontogpt", line 6, in <module>
    sys.exit(main())
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/harry/ontogpt/src/ontogpt/cli.py", line 324, in extract
    ke = SPIRESEngine(
  File "<string>", line 23, in __init__
  File "/home/harry/ontogpt/src/ontogpt/engines/knowledge_engine.py", line 184, in __post_init__
    self.set_up_client(model_source=self.model_source)
  File "/home/harry/ontogpt/src/ontogpt/engines/knowledge_engine.py", line 603, in set_up_client
    self.client = GPT4AllClient(model=self.model)
  File "<string>", line 8, in __init__
  File "/home/harry/ontogpt/src/ontogpt/clients/gpt4all_client.py", line 37, in __post_init__
    self.local_model = llm.get_model(self.model)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/llm/__init__.py", line 150, in get_model
    raise UnknownModelError("Unknown model: " + name)
llm.UnknownModelError: 'Unknown model: orca-2-7b'

Same version of llm in both cases:

$ llm --version
llm, version 0.12
$ poetry show | grep llm
llm                               0.12            A CLI utility and Python ...
llm-gpt4all                       0.2             Plugin for LLM adding sup...
@caufieldjh (Member, Author)

Calling the model directly through llm's get_model works as expected:

$ python
Python 3.10.13 (main, Aug 25 2023, 13:20:03) [GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from llm import get_model
>>> local_model = get_model("3.5")
>>> local_model
<Model 'gpt-3.5-turbo'>
>>> local_model = get_model("orca-2-7b")
>>> local_model
<Model 'orca-2-7b'>

@caufieldjh (Member, Author)

and orca-2-7b is in the llm aliases:

$ python
...
>>> from llm import get_model_aliases
>>> aliases = get_model_aliases()
>>> "orca-2-7b" in aliases
True

@caufieldjh (Member, Author)

When initializing a GPT4AllClient, though, a call to llm.get_model_aliases() shows an entirely different list of alias mappings.
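For context, llm builds its model and alias registry from plugin hooks, so the mapping a given call site sees depends on which plugins were loaded before the lookup runs. The following is a minimal simulation of that mechanism (illustrative only; the function names here are hypothetical, not llm's real API):

```python
# Simulated plugin-driven alias registry. Each "plugin" registers the
# aliases it provides; the registry a lookup sees depends on which
# plugins ran before the lookup.

def core_plugin(registry):
    registry["gpt-3.5-turbo"] = "gpt-3.5-turbo"
    registry["3.5"] = "gpt-3.5-turbo"

def gpt4all_plugin(registry):
    registry["orca-2-7b"] = "orca-2-7b"

def build_aliases(plugins):
    registry = {}
    for plugin in plugins:
        plugin(registry)
    return registry

# CLI-like process: both plugins loaded before the lookup, so the alias resolves.
full = build_aliases([core_plugin, gpt4all_plugin])
print("orca-2-7b" in full)     # True

# Client-like process: only the core (OpenAI) plugin loaded, so the lookup
# fails with the same KeyError that surfaces as llm.UnknownModelError.
partial = build_aliases([core_plugin])
print("orca-2-7b" in partial)  # False
```

This would be consistent with the observation above: the in-process alias map contains only OpenAI entries, as if llm-gpt4all's hook never ran in that context.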

@bartkl commented Feb 27, 2024

> When initializing a GPT4AllClient, though, a call to llm.get_model_aliases() shows an entirely different list of alias mappings.

Yes, I only saw OpenAI GPT entries in that mapping.

> Just to check, are the version of llm you're calling from the CLI and the one ontogpt is using (i.e., the one in its virtualenv) the same? [Slack]

It's the exact same llm within the same virtualenv; I don't have any other installed.

@durabledata

It appears the error was introduced here: 706ab63#diff-d63276a5c9a6f76f8aa13cf0de45fe92c7b8abe4a329fe2d0e171d771ddce172R11
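If the linked change is the culprit, one plausible failure mode (an assumption, not verified against the diff) is the model lookup consulting a registry snapshot taken at import time, before the gpt4all plugin has registered its models, rather than resolving against the live registry at call time. A sketch of the difference:

```python
# Hypothetical illustration of eager vs. lazy alias resolution; this is
# not llm's actual code, just the general bug pattern.

REGISTRY = {}

# Eager snapshot: copied "at import time", before any plugins have run.
SNAPSHOT = dict(REGISTRY)

def load_plugins():
    # Stand-in for the llm-gpt4all plugin registering its models.
    REGISTRY["orca-2-7b"] = "orca-2-7b"

load_plugins()

def get_model_lazy(name):
    return REGISTRY[name]   # resolves against the live registry

def get_model_eager(name):
    return SNAPSHOT[name]   # stale copy: plugin-provided models are missing

print(get_model_lazy("orca-2-7b"))   # orca-2-7b
try:
    get_model_eager("orca-2-7b")
except KeyError as e:
    print("KeyError:", e)            # KeyError: 'orca-2-7b'
```

Under this reading, the lazy lookup matches the working REPL session above, while the eager one reproduces the KeyError in the traceback.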
