How to use deepseek #34
Comments
More error info:
Hi, I also encountered the same problem on Python 3.9. Since the error message is about the request URL, I tried to comment out
After the modification, it seems that litellm can connect to deepseek now, but it complains that the model size is too big.
I would like to know how to resolve the issue, too.
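Rather than commenting out code inside the library, litellm's completion call accepts an explicit api_base argument, which is usually enough to point it at a self-hosted OpenAI-compatible endpoint. A minimal sketch, assuming such an endpoint exists locally (the URL, key, and helper name below are placeholders, not part of the original report):

```python
import os

# Hypothetical local endpoint serving the deepseek model; replace with yours.
DEEPSEEK_API_BASE = "http://localhost:8000/v1"

def build_deepseek_request(prompt: str) -> dict:
    """Assemble the kwargs that would be passed to litellm.completion()."""
    return {
        # The "openai/" prefix tells litellm to speak the OpenAI-compatible
        # protocol to a custom endpoint; the model name is passed through.
        "model": "openai/deepseek-coder-33b-instruct",
        "messages": [{"role": "user", "content": prompt}],
        "api_base": DEEPSEEK_API_BASE,
        "api_key": os.environ.get("DEEPSEEK_API_KEY", "sk-placeholder"),
    }

kwargs = build_deepseek_request("Write a hello-world in Python.")
print(kwargs["model"])
```

The actual network call (litellm.completion(**kwargs)) is left out here so the snippet stays runnable without a live server.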
I ran into the same error: #34 (comment) @huanhuan6666
When I modified the configuration file to set model="deepseek-coder-33b-instruct" and ran the code for model inference, it failed with the following error:
It seems that litellm does not support deepseek, and I would like to know how to resolve this issue.
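One likely cause: litellm routes requests by a "provider/model" naming convention, so a bare model name like "deepseek-coder-33b-instruct" falls back to the default (OpenAI-style) route and fails. The helper below is a hypothetical illustration of that convention, not litellm's actual internals; whether your pinned litellm version ships a dedicated deepseek provider is an assumption worth checking against its docs:

```python
def resolve_provider(model: str) -> tuple:
    """Illustrative split of litellm's 'provider/model' naming convention."""
    provider, _, name = model.partition("/")
    # Without a prefix, the name is treated as belonging to the default
    # (OpenAI-style) provider -- which is why a bare deepseek name fails.
    return (provider, name) if name else ("openai", model)

print(resolve_provider("deepseek-coder-33b-instruct"))  # bare name, no route
print(resolve_provider("deepseek/deepseek-coder"))      # explicit provider
```

If your litellm version does recognize a deepseek provider, passing the prefixed form in the config should route the request correctly; otherwise upgrading litellm or using an api_base override are the usual workarounds.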