phi3 #262
@francescoagati, we have integration with phi3. What issues are you having? We are super active on Discord: happy to help you get this figured out.
In many cases, using Ollama with phi3 doesn't work; it gives an empty response.
With function-calling tools, it sometimes doesn't call the tools.
But now I find that using pydantic works well.
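The pydantic approach mentioned above can be sketched as follows. This is a minimal illustration, not phidata's actual API: the `ToolCall` model and the raw string are hypothetical, and the point is simply that validating the model's output against a pydantic schema either yields typed data or fails loudly instead of silently producing an empty or malformed result.

```python
from pydantic import BaseModel, ValidationError


class ToolCall(BaseModel):
    """Hypothetical schema for a function call the LLM is asked to emit."""
    tool: str
    query: str


# Example raw output from the model (assumed well-formed here).
raw = '{"tool": "search", "query": "phi3"}'

try:
    # Validate and parse the JSON string directly into a typed object.
    call = ToolCall.model_validate_json(raw)
    print(call.tool)
except ValidationError as exc:
    # Malformed output is surfaced as an explicit error rather than
    # silently passed downstream.
    print("model returned malformed output:", exc)
```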
@francescoagati a large part of the problem is the LLM not being good enough to return the format correctly. We see this behavior more when running Llama3, Llama2, and phi3 on Ollama, and less when you run these models on Groq. The issue is that the Ollama LLM doesn't always return the JSON function call in the correct format, i.e. it appends text before the function call ("here is the json output: {}"). Let me do some digging and get back to you.
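One common workaround for the failure mode described above (prose prepended to the JSON payload) is to slice out the outermost JSON object before parsing. This is a rough sketch, not phidata's implementation; it assumes the response contains exactly one top-level JSON object:

```python
import json


def extract_json(text: str) -> dict:
    """Parse a JSON object out of LLM output that may have prose around it.

    Local models on Ollama sometimes wrap the payload, e.g.
    "here is the json output: {...}", so we strip everything outside
    the first '{' and the last '}' before calling json.loads.
    """
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end < start:
        raise ValueError("no JSON object found in model output")
    return json.loads(text[start : end + 1])


raw = 'here is the json output: {"tool": "search", "args": {"query": "phi3"}}'
print(extract_json(raw))
```

Note this heuristic breaks if the surrounding prose itself contains braces; a stricter fix is to request JSON mode from the model where the backend supports it.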
Hey @francescoagati phi3 is a 3.8B parameter model. It is great for conversation but not so much for function calling or even RAG. You mentioned that phi3 has good support for function calling. Can you please share some examples of that so that we can look into it? Thanks |
phi3 has good support for function calling, but it doesn't work with phidata.