Is there any possibility to create a local proxy server for APIs like Gemini or Groq? Right now I see that option for local models only #2288
Closed
HakaishinShwet started this conversation in General
Replies: 2 comments
-
Yea - https://docs.litellm.ai/docs/providers: all of those are supported. Anything missing in the docs?
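For context, the LiteLLM proxy linked above is driven by a YAML config that maps a public model name to a provider-prefixed model. A minimal sketch for Groq (the `groq-llama3` alias and the exact Groq model ID here are illustrative; check the providers page for current IDs):

```yaml
model_list:
  - model_name: groq-llama3            # the name clients will request
    litellm_params:
      model: groq/llama3-8b-8192       # illustrative Groq model ID
      api_key: os.environ/GROQ_API_KEY # read the key from the environment
```

Starting the proxy with `litellm --config config.yaml` then exposes an OpenAI-compatible endpoint locally (port 4000 by default) that any OpenAI-style client can point at.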
-
I resolved this issue on Discord, closing now :-))
-
I am asking this because I wanted to connect Groq with Flowise, and currently there is no option to integrate it directly. So I thought that if I create a local proxy server, I could use Flowise's local AI functionality to connect to it, provide my Groq API key, and chat using Groq. That's why I asked.
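To illustrate the idea: once a local OpenAI-compatible proxy is running, Flowise (or any OpenAI-style client) would just send a standard chat-completions payload to it, e.g. to `http://localhost:4000/v1/chat/completions`. A hedged sketch of that request body (the URL, port, and `groq-llama3` model alias are assumptions, matching whatever the proxy config exposes; no network call is made here):

```python
import json

# Hypothetical: the OpenAI-format request a client like Flowise would POST
# to a local proxy at http://localhost:4000/v1/chat/completions.
# We only build and print the payload; sending it requires a running proxy.
payload = {
    "model": "groq-llama3",  # assumed model alias exposed by the proxy config
    "messages": [
        {"role": "user", "content": "Hello from Flowise via the local proxy"}
    ],
}

body = json.dumps(payload)
print(body)
```

Because the proxy speaks the OpenAI wire format, the client never needs Groq-specific support; it only needs a configurable base URL and an API key field.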