
[Bug]: GPTCache Server and Libraries Failed for openai==1.0.0 #570

Open
ephemeral2eternity opened this issue Nov 19, 2023 · 2 comments

Current Behavior

The following command fails when starting the GPTCache server:

$ gptcache_server -s 127.0.0.1 -p 8000

Or

$ docker pull zilliz/gptcache:latest
$ docker run -p 8000:8000 -it zilliz/gptcache:latest

The error is shown below.

successfully installed package: openai
Traceback (most recent call last):
  File "/usr/local/bin/gptcache_server", line 5, in <module>
    from gptcache_server.server import main
  File "/usr/local/lib/python3.8/site-packages/gptcache_server/server.py", line 8, in <module>
    from gptcache.adapter import openai
  File "/usr/local/lib/python3.8/site-packages/gptcache/adapter/openai.py", line 31, in <module>
    class ChatCompletion(openai.ChatCompletion, BaseCacheLLM):
  File "/usr/local/lib/python3.8/site-packages/openai/_utils/_proxy.py", line 22, in __getattr__
    return getattr(self.__get_proxied__(), attr)
  File "/usr/local/lib/python3.8/site-packages/openai/_utils/_proxy.py", line 43, in __get_proxied__
    return self.__load__()
  File "/usr/local/lib/python3.8/site-packages/openai/lib/_old_api.py", line 33, in __load__
    raise APIRemovedInV1(symbol=self._symbol)
openai.lib._old_api.APIRemovedInV1:

You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.

You can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface.

Alternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`

A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
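
The failure happens at import time: gptcache/adapter/openai.py subclasses openai.ChatCompletion, a symbol that openai>=1.0.0 no longer exposes, having replaced the module-level classes with a client object. A minimal sketch of what the v1 call looks like (the model name and message are placeholders, and OPENAI_API_KEY is assumed to be set in the environment):

# Pre-1.0 interface, which the adapter still targets:
#   openai.ChatCompletion.create(model=..., messages=...)
# openai>=1.0.0 removed that class in favor of a client-based interface:
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)

Until the adapter is updated for the v1 client, pinning the dependency as the error message suggests (pip install openai==0.28) keeps the old import path working.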

Expected Behavior

> gptcache_server -s 127.0.0.1 -p 8000
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
INFO:     Started server process [8545]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     127.0.0.1:61051 - "POST /get HTTP/1.1" 200 OK

Steps To Reproduce

1. docker pull zilliz/gptcache:latest
2. docker run -p 8000:8000 -it zilliz/gptcache:latest

Environment

No response

Anything else?

No response

SimFG (Collaborator) commented Jan 8, 2024

same issue: #576

yudhiesh commented Feb 1, 2024

Any plans to fix this? The server is completely unusable.
