[Bug]: When using LlamaIndex to access an Ollama local LLM service on Windows, an error occurs #13526
Comments
Hey there @cnrevol! 👋 I'm Dosu, your friendly neighborhood bot, here to lend a hand with bugs, answer your queries, and guide you through contributions while we wait for a human to join the convo. Just wanted to say hi and let you know I'm diving into your issue with LlamaIndex and the Ollama local LLM service. Hang tight, I'll get back to you with some insights shortly!
I don't think this is a llamaindex issue. More of an ollama/windows issue? You can try changing the base_url in the class to point to a more Windows-friendly address (for example, http://127.0.0.1:11434).
Changing the base_url to http://127.0.0.1:11434 can help when Windows treats localhost and the loopback IP differently in its proxy rules. Your initial solution of creating a subclass and setting trust_env=False on the httpx client also works, since it prevents httpx from picking up the system proxy for local requests. In summary, adjusting the base_url or disabling proxy detection with trust_env=False should avoid the 403 Forbidden error.
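For example, a minimal sketch of pointing the LlamaIndex Ollama wrapper at the loopback IP (assuming the llama-index-llms-ollama integration is installed; the model name is just a placeholder):

```python
from llama_index.llms.ollama import Ollama

# Point the client at the loopback IP instead of "localhost"; some Windows
# proxy configurations treat the two differently.
llm = Ollama(
    model="llama2",                     # placeholder model name
    base_url="http://127.0.0.1:11434",  # default Ollama port
    request_timeout=120.0,
)

print(llm.complete("Hello").text)
```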
Thank you, @dosu, @logan-markewich. I tried modifying base_url to http://127.0.0.1, but it seems I still encounter the same error. Is there something else I don't understand, or could this indeed be a Windows issue? I need to further investigate how Windows proxy settings handle external IPs, local IPs, and localhost differently... Thank you.
If you're still encountering a 403 Forbidden error after switching base_url to http://127.0.0.1, here are some focused steps to troubleshoot and potentially resolve the issue: check the Windows proxy settings (in particular whether local addresses are in the proxy exception list), check whether HTTP_PROXY/HTTPS_PROXY environment variables are set, and verify whether httpx is still routing the request through the system proxy (for example by constructing the client with trust_env=False).
It's also important to ensure that any application-specific configurations, such as those for interacting with local LLM services, are correctly set up. If after checking these areas the issue persists, providing more information about your setup, including specific error messages from your application logs and any changes made to default settings, would be helpful for further diagnosis.
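As a quick, general diagnostic (nothing specific to LlamaIndex), you can check which proxies Python-based HTTP clients will pick up on Windows, and exclude local addresses via the conventional NO_PROXY variable:

```python
import os
import urllib.request

# Show the proxies Python (and therefore httpx, by default) will pick up
# from the Windows registry and environment variables.
print(urllib.request.getproxies())

# One common mitigation: exclude local addresses from proxying before
# constructing any HTTP clients.
os.environ["NO_PROXY"] = "localhost,127.0.0.1"
```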
Bug Description
In a Windows environment, when the OS system proxy is set, using LlamaIndex to access a local Ollama LLM results in an access error.
httpx.HTTPStatusError: Client error '403 Forbidden' for url 'http://localhost:11434/api/chat'
Version
0.10.33
Steps to Reproduce
When using LlamaIndex to access a local LLM, I encountered a minor issue, and I found a solution for it.
Problem Description:
In a Windows environment, when the OS system proxy is set, using LlamaIndex to access a local LLM (such as one running on Ollama) results in an access error.
Reason:
LlamaIndex uses httpx to call the LLM's URL.
By default, the httpx library in Python picks up the system proxy settings when making requests.
As a result, on Windows the request to the localhost service provided by Ollama is routed through the system proxy, which causes the error.
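For illustration, a small httpx sketch of the difference trust_env makes (the request body simply follows Ollama's chat API; the model name is a placeholder):

```python
import httpx

# Default behavior: httpx.Client(trust_env=True) picks up HTTP(S)_PROXY and the
# Windows system proxy, so even a request to localhost may be sent via the proxy.
# With trust_env=False, the request goes directly to the local Ollama server.
with httpx.Client(trust_env=False) as client:
    resp = client.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama2",  # placeholder model name
            "messages": [{"role": "user", "content": "Hello"}],
            "stream": False,
        },
        timeout=120.0,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])
```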
Solution:
I solved this by creating a subclass that inherits from the CustomLLM class (llama_index.core.llms.custom.CustomLLM)
and passing trust_env=False as a parameter when constructing the httpx.Client.
There might be better solutions available.
I hope that trust_env=False can be exposed as an initialization parameter when subclassing CustomLLM (or passed through on the call), which also seems to solve this issue; see the sketch below.
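A minimal sketch of this kind of subclass is shown below; the class name, default fields, and use of Ollama's /api/generate endpoint are illustrative assumptions rather than the author's exact code:

```python
import httpx

from llama_index.core.llms import (
    CompletionResponse,
    CompletionResponseGen,
    CustomLLM,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class OllamaNoProxy(CustomLLM):
    """Talks to a local Ollama server while ignoring the system proxy."""

    base_url: str = "http://localhost:11434"
    model: str = "llama2"  # placeholder model name

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(model_name=self.model)

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs) -> CompletionResponse:
        # trust_env=False keeps httpx from picking up the Windows system proxy,
        # so the request to localhost is sent directly to Ollama.
        with httpx.Client(trust_env=False, timeout=120.0) as client:
            resp = client.post(
                f"{self.base_url}/api/generate",
                json={"model": self.model, "prompt": prompt, "stream": False},
            )
            resp.raise_for_status()
            return CompletionResponse(text=resp.json().get("response", ""))

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs) -> CompletionResponseGen:
        # Streaming is omitted in this sketch; emit the full completion at once.
        yield self.complete(prompt, **kwargs)
```

An instance such as OllamaNoProxy(model="llama2") can then be passed wherever LlamaIndex expects an LLM (for example, Settings.llm).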
Relevant Logs/Tracebacks
No response