
Relevance via LiteLLM? #26

Open
krrishdholakia opened this issue Sep 17, 2023 · 5 comments
@krrishdholakia

Hi @deadbits,

Thanks for using litellm! Curious how you're thinking of using it in this case?

@deadbits (Owner)

Hey 👋 thanks for reaching out. It isn't fully implemented right now (I expect to finish it this week), but the plan is to use LiteLLM to make calls to a user-defined LLM. LiteLLM seemed like the best way to let users pick whichever model they want without me needing to write a wrapper for each provider.

There's the start of the LiteLLM code here (pretty minimal so far): https://github.com/deadbits/vigil-llm/blob/main/vigil/llm.py
It gets used here: https://github.com/deadbits/vigil-llm/blob/main/vigil/scanners/relevance.py
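
For illustration (not the actual code behind those links), here is a minimal sketch of the kind of wrapper I mean; the class name, default model, and message format are placeholders:

```python
# Minimal sketch: let the user pick any provider/model LiteLLM supports,
# without writing a per-provider wrapper. Not the actual vigil/llm.py.
from litellm import completion


class LLM:
    def __init__(self, model: str = "gpt-3.5-turbo"):
        # "model" is whatever the user configures, e.g. "gpt-4",
        # "claude-instant-1", "ollama/llama2", etc.
        self.model = model

    def generate(self, prompt: str) -> str:
        response = completion(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        # LiteLLM normalizes provider responses to the OpenAI format
        return response.choices[0].message.content
```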

The issue I'm trying to resolve now is how best to get structured output out of LLMs in general, and how to add that on top of LiteLLM. (Guidance looked good, but I don't think I can use it with LiteLLM? I could be mistaken.)
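
To make the structured-output problem concrete, this is roughly what I'd have to do without Guidance: ask the model for JSON and parse it, with a fallback when it ignores the instructions. The prompt and schema below are placeholders, not the actual relevance scanner:

```python
# Sketch: prompt for JSON, parse it, fall back to None on bad output.
import json

from litellm import completion

PROMPT = """Rate how relevant the following text is to the user's question.
Respond ONLY with JSON in the form {{"relevant": true, "reason": "<short reason>"}}.

Text: {text}
Question: {question}"""


def relevance(model: str, text: str, question: str) -> dict | None:
    response = completion(
        model=model,
        messages=[{"role": "user", "content": PROMPT.format(text=text, question=question)}],
    )
    raw = response.choices[0].message.content
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # The model ignored the format instructions; the caller decides what to do.
        return None
```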

deadbits self-assigned this on Sep 17, 2023
deadbits added the question (Further information is requested) and LLM (Direct use of LLMs) labels on Sep 17, 2023
@krrishdholakia (Author)

We have an open PR awaiting approval: guidance-ai/guidance#347

If you could add a comment there, I think that'd help.

@krrishdholakia (Author)

Do you need litellm if you use Guidance? I think they already support a bunch of different providers.

Any specific ones you're trying to use us for?

@deadbits (Owner)

Awesome, I'll keep an eye on that Guidance PR. I don't need LiteLLM if I use Guidance, but I was hoping I could combine Guidance + LiteLLM to get structured output from Guidance and model availability from LiteLLM. Unfortunately that doesn't seem to be possible yet!

I'm going to stick with LiteLLM, probably just use Guidance for some test runs to see what their prompts look like and apply them manually via LiteLLM 😄
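
Something like this is what I have in mind: write the constrained, Guidance-style template by hand and send it through LiteLLM, validating the answer myself. The labels and wording here are just placeholders:

```python
# Sketch of "Guidance-style by hand": a fixed-choice template sent through
# LiteLLM, with the choice validated in code.
from litellm import completion

CHOICES = ["relevant", "irrelevant", "unknown"]

TEMPLATE = (
    "Classify how relevant the text below is to the question.\n"
    "Answer with exactly one word from: {choices}.\n\n"
    "Question: {question}\nText: {text}\nAnswer:"
)


def classify(model: str, question: str, text: str) -> str:
    prompt = TEMPLATE.format(choices=", ".join(CHOICES), question=question, text=text)
    response = completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=5,
        temperature=0,
    )
    answer = response.choices[0].message.content.strip().lower()
    # Unlike Guidance, nothing forces the model to stay in the set,
    # so fall back to "unknown" if it wanders off.
    return answer if answer in CHOICES else "unknown"
```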

@krrishdholakia (Author) commented Sep 25, 2023

@deadbits I'm trying to improve litellm. Can you chat for ~10 minutes this week?

Want to understand how you found the integration experience, and what we could improve on.

Also DM'ed you on LinkedIn if that helps.
