
[Enhancement]: How gptcache can better adapt to openai 1.x #613

Open
SimFG opened this issue Mar 8, 2024 · 1 comment

SimFG commented Mar 8, 2024

What would you like to be added?

Before openai 1.x, the interface took the form of static methods on a class, such as openai.ChatCompletion.create, for example:
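
import openai

chat_completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

But in openai 1.x, a client object is used instead: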

from openai import OpenAI
client = OpenAI(
    # Defaults to os.environ.get("OPENAI_API_KEY")
)

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

So the previous way of getting seamless access to gptcache, simply replacing the imported openai package with gptcache's adapter, no longer works.
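
For reference, that pattern looked roughly like this (a sketch based on gptcache's documented adapter usage; exact details may vary by version):

from gptcache import cache
from gptcache.adapter import openai  # drop-in replacement for `import openai`

# Initialize the cache and pass the OpenAI key through gptcache
cache.init()
cache.set_openai_key()

# Same call shape as before; responses are now served from and stored in the cache
chat_completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

At present, the approach I can think of is to proxy the relevant openai interfaces through wrapper functions, such as: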

def cache_openai_chat_complete(client: OpenAI, **openai_kwargs: Any):
    pass
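
A minimal sketch of what such a proxy could look like; cache_get and cache_put below are hypothetical placeholders for gptcache's lookup and store logic, not existing gptcache APIs:

from typing import Any, Optional

from openai import OpenAI


def cache_get(messages: list) -> Optional[Any]:
    # Hypothetical: look up a cached answer for these messages
    ...


def cache_put(messages: list, response: Any) -> None:
    # Hypothetical: store the response keyed by the messages
    ...


def cache_openai_chat_complete(client: OpenAI, **openai_kwargs: Any):
    # Try the cache first; only call the real client on a miss
    messages = openai_kwargs.get("messages", [])
    cached = cache_get(messages)
    if cached is not None:
        return cached
    response = client.chat.completions.create(**openai_kwargs)
    cache_put(messages, response)
    return response

The caller would then use cache_openai_chat_complete(client, model="gpt-3.5-turbo", messages=[...]) in place of client.chat.completions.create(...).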

Why is this needed?

No response

Anything else?

No response


SimFG commented Mar 13, 2024

If anyone has better ideas, suggestions are welcome.
I have opened PR #614.
I haven't merged the PR or bumped a new version yet, because I'd like to hear more people's suggestions first.
