Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.8
Cloud or Self Hosted
Cloud
Steps to reproduce
I used the OpenAI-API-Compatible provider to add my local LiteLLM proxy with Base_URL = http://localhost:4000, the LiteLLM API key, and the model name (masked, proxied by LiteLLM). LiteLLM itself works fine when tested with curl and Python, but I cannot add the local LiteLLM instance to Dify.
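For reference, this is the kind of standalone check that succeeds against the proxy outside of Dify (the API key and model name below are placeholders, not values from the report):

```shell
# Sanity check: call the LiteLLM proxy's OpenAI-compatible endpoint directly.
# "sk-your-key" and "your-model" are placeholders for the real credentials.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-key" \
  -d '{"model": "your-model", "messages": [{"role": "user", "content": "ping"}]}'
```

A successful JSON response here only proves the proxy is reachable from the host itself, not from inside a Dify container.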
✔️ Expected Behavior
I expect LiteLLM to work with Dify.
❌ Actual Behavior
The error: An error occurred during credentials validation: HTTPConnectionPool(host='localhost', port=4000): Max retries exceeded with url: /chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f966023a4d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
To resolve the issue where Dify cannot connect to LiteLLM running on localhost, follow these steps:
Change LiteLLM Binding: Ensure LiteLLM is bound to 0.0.0.0 instead of localhost so it is reachable from other hosts and containers, not just from the machine it runs on.
Update Base URL in Dify: If Dify is running in Docker and LiteLLM is on the host, use http://host.docker.internal:4000 as the Base URL instead of localhost.
Test Network Accessibility: From within the Dify Docker container, use curl to test connectivity to the LiteLLM service.
Restart Both Services: After making these adjustments, restart LiteLLM and Dify.
This approach addresses the common issue where Docker containers cannot directly access services on the host using localhost. Adjusting the service binding and using host.docker.internal for Docker setups should help Dify connect to LiteLLM.