
Using Mistral through Jan.ai with this extension #32

Open
saleh-mir opened this issue Jan 22, 2024 · 1 comment

Comments

saleh-mir commented Jan 22, 2024

I recently came across a tool called Jan.ai that lets you run models such as Mistral Instruct offline, which is really powerful. Apparently, it exposes the same API as OpenAI.

Is it possible to use it with this extension for Raycast? If so, are there any resources I could read about this? And if you need my help, please do tell.

This is an example of how I can access the model through their API, which is the same as OpenAI's:

curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer EMPTY" \
  -d '{
     "model": "mistral-ins-7b-q4",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

Reply:

{
  "choices": [
    {
      "finish_reason": null,
      "index": 0,
      "message": {
        "content": " I understand that you have asked me to say that \"this is a test.\" Here is that statement for you: \"This is a test.\" Is there anything specific you would like me to do with this test, or is it simply for my understanding that we are conducting a test? Let me know if there's anything else I can assist you with.",
        "role": "assistant"
      }
    }
  ],
  "created": 1705970517,
  "id": "tWF3q25T3yaeF96K2bVF",
  "model": "_",
  "object": "chat.completion",
  "system_fingerprint": "_",
  "usage": {
    "completion_tokens": 72,
    "prompt_tokens": 14,
    "total_tokens": 86
  }
}
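For anyone wanting to script the same call outside of curl, here is a minimal Python sketch using only the standard library. The endpoint URL, model name, and placeholder API key are taken from the curl example above; this is not tied to the Raycast extension in any way.

```python
# Sketch: calling Jan.ai's OpenAI-compatible chat completions endpoint.
# Assumes a Jan.ai server is listening on localhost:1337 (as in the
# curl example above); the "EMPTY" bearer token is a placeholder that
# Jan.ai accepts because no key is required locally.
import json
import urllib.request

JAN_URL = "http://localhost:1337/v1/chat/completions"


def build_request(model: str, user_message: str, temperature: float = 0.7):
    """Return (url, headers, body) mirroring the curl example above."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    }).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        # The OpenAI wire format expects an Authorization header even
        # when the server does not check the key.
        "Authorization": "Bearer EMPTY",
    }
    return JAN_URL, headers, body


def chat(model: str, user_message: str) -> str:
    """POST the request and return the assistant's reply text."""
    url, headers, body = build_request(model, user_message)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    return reply["choices"][0]["message"]["content"]


# Usage (requires a running Jan.ai server):
#   chat("mistral-ins-7b-q4", "Say this is a test!")
```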
SKaplanOfficial (Owner) commented:

If the API is exactly the same, you can follow the OpenAI API example in the readme, substituting your localhost endpoint (i.e. http://localhost:1337/v1/chat/completions) for the OpenAI one. You can leave the API key field blank.
