
adding support for anthropic, azure, cohere, llama2 #26

Open · wants to merge 2 commits into master

Conversation

@krrishdholakia commented Aug 8, 2023

Hi @polyrabbit ,

Noticed you're only calling OpenAI. I'm working on litellm (simple library to standardize LLM API Calls - https://github.com/BerriAI/litellm) and was wondering if we could be helpful.

Added support for Claude, Cohere, Azure and Llama2 (via Replicate) by replacing the ChatOpenAI completion call with a litellm completion call. The code is pretty similar to the OpenAI class - as litellm follows the same pattern as the openai-python sdk.

Would love to know if this helps.

Happy to add additional tests / update documentation, if the initial PR looks good to you.
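The swap described above can be sketched roughly as follows. This is an illustrative helper, not code from this repo; the model names and the `ask` function are assumptions, and litellm's API may have changed since this PR:

```python
# Hedged sketch: litellm.completion accepts the same messages format as
# openai.ChatCompletion.create, and routes to a provider based on the
# model string. The `ask` helper below is hypothetical.
def ask(model: str, prompt: str):
    from litellm import completion  # pip install litellm
    messages = [{"role": "user", "content": prompt}]
    return completion(model=model, messages=messages)

# The same call shape then covers every provider, e.g.:
#   ask("gpt-3.5-turbo", "...")    # OpenAI
#   ask("claude-2", "...")         # Anthropic (Claude)
#   ask("command-nightly", "...")  # Cohere
# Llama2 via Replicate uses a replicate-prefixed model string.
```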

@polyrabbit (Owner)

Hi, thanks for this wonderful library.

Just one quick question - does it support function calling for other models, or even just OpenAI models? This app relies on a JSON response.

@krrishdholakia (Author)

Yes, it supports function calling - exactly like how OpenAI calls it: https://litellm.readthedocs.io/en/latest/input/
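For reference, the OpenAI-style function-calling request shape (which litellm mirrors, per the docs linked above) looks like this. The schema is a made-up example, not from this repo:

```python
# Illustrative OpenAI-style function schema. litellm forwards a
# `functions` list in the same shape as openai-python; the schema
# contents here are hypothetical.
functions = [{
    "name": "summarize_story",  # hypothetical function name
    "description": "Return a JSON summary of a news story",
    "parameters": {
        "type": "object",
        "properties": {
            "summary": {"type": "string"},
            "score": {"type": "number"},
        },
        "required": ["summary"],
    },
}]

# Passed the same way as with openai-python:
#   litellm.completion(model="gpt-3.5-turbo", messages=messages,
#                      functions=functions)
```

The model then replies with a JSON `function_call` payload matching the declared parameters, which is what makes a JSON-reliant app like this one workable.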

@polyrabbit (Owner)

Nice! I'll try it later, thanks

@polyrabbit (Owner)

One difference I found is in how the timeout is set - OpenAI uses a `timeout` parameter whereas litellm uses `force_timeout`. Is this intended?

Could you please also add litellm as a dependency to the requirements.txt file?

@krrishdholakia (Author)

Hey @polyrabbit, I updated the requirements.txt.

Re: timeout - I thought that was for the Completions endpoint; I don't recall seeing a `timeout` parameter for ChatCompletions. If you can share any relevant documentation, I'm happy to check it out.

Let me know if there are any remaining blockers for this PR.

@polyrabbit (Owner)

I see it here: https://github.com/openai/openai-python/blob/b82a3f7e4c462a8a10fa445193301a3cefef9a4a/openai/api_resources/chat_completion.py#L21-L28

def create(cls, *args, **kwargs):
    """
    Creates a new chat completion for the provided messages and parameters.

    See https://platform.openai.com/docs/api-reference/chat-completions/create
    for a list of valid parameters.
    """
    start = time.time()
    timeout = kwargs.pop("timeout", None)

So `timeout` is used in my code; after switching to litellm, the code throws an exception: unexpected keyword argument 'timeout'.
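Until a fix lands in litellm, one minimal workaround (a hypothetical shim, not part of either library) is to translate the kwarg name at the call site:

```python
def adapt_timeout(kwargs: dict) -> dict:
    """Translate the openai-python style `timeout` kwarg into litellm's
    `force_timeout` (parameter names as reported in this thread)."""
    kwargs = dict(kwargs)  # copy so the caller's dict is not mutated
    if "timeout" in kwargs:
        kwargs["force_timeout"] = kwargs.pop("timeout")
    return kwargs

# Example: adapt_timeout({"timeout": 15, "temperature": 0})
# -> {"temperature": 0, "force_timeout": 15}
```

Kwargs without a `timeout` key pass through unchanged, so the shim is safe to apply unconditionally.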

@krrishdholakia (Author)

Got it - I'll make a fix for it and update the PR.
