Skip to content

Ollama llava proxy_server_config.yaml and curl request #1864

Answered by Lunik
fatihyildizhan asked this question in Q&A
Discussion options

You must be logged in to vote

Here is the call I have made to LiteLLM API and is working since #2201

curl "http://127.0.0.1:8000/v1/chat/completions" \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "model": "ollama/llava",
    "messages": [
      { "role": "user", "content": [
        { "type": "text", "text": "Whats in this image?" },
        { "type": "image_url", "image_url": { "url": "iVBORw...SuQmCC" } }
      ]}
    ]
  }'

Here is the response from LiteLLM

{"id":"chatcmpl-1b971af2-77f5-47a8-b06e-7465ae01251a","choices":[{"finish_reason":"stop","index":0,"message":{"content":" The image you've provided is a playful and cute illustration of an animated character that looks like an anthropomorphic…

Replies: 6 comments 3 replies

Comment options

You must be logged in to vote
1 reply
@fatihyildizhan
Comment options

Comment options

You must be logged in to vote
0 replies
Comment options

You must be logged in to vote
1 reply
@Lunik
Comment options

Comment options

You must be logged in to vote
0 replies
Comment options

You must be logged in to vote
0 replies
Comment options

You must be logged in to vote
1 reply
@fatihyildizhan
Comment options

Answer selected by fatihyildizhan
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Category
Q&A
Labels
None yet
3 participants