pip doesn't respect proxy while installing packages from requirements file #1805
Comments
Is the one that fails a direct HTTPS request to PyPI in a `setup_requires`?
No, I don't think so. I checked in the /tmp/pip_built_root directory and grep'd for 'setup_requires' in all setup.py/setup.cfg files. Nothing. The installation failed as expected. Let me know if there is a different way to verify it.
Is it a particular package that is reaching out to PyPI?
Here are the contents of reqs.pip.
Installation fails at stripe. If I comment out simplejson, it fails at raven. Here is the failure log. Note that the timeout is due to the firewall dropping packets sent directly to the PyPI server.
I'm having the same error. Strangely enough, it installs a bunch of packages from the requirements file without problems but then fails with the VerifiedHTTPSConnection timeout error.
I couldn't find a solution for this. As a workaround, I used proxychains to enforce a transparent proxy.
It would be awesome if we could get a simple set of steps to reproduce this.
My requirements.txt contains 20 entries; pip installs the first entries without any problems and then throws the error. Interestingly enough, it always fails when installing the 16th package. Increasing the timeout to something like 300 doesn't help.
I'm facing the same issue on CentOS 6.4 with Python 2.6.6 and pip 1.5.6. Any ideas for a workaround?
Seems like it is always the 15th item in the requirements file. If I comment out earlier items, it times out on a later item than previously. Tried with pip 1.5.4 and 1.5.6 on Ubuntu 14.04 LTS.
@pcraston could it be some sort of rate limiting on your proxy? With the requirements file, is pip sending too many requests in a short period of time?
Just spoke to our IT manager; he says there is no rate limiting on our proxy. But it does seem odd!
It doesn't seem to be enforced by the proxy. If you monitor the outgoing
I can confirm I have the same issue, installing threatstream/mhn with its requirements file.
This appears to be a requests bug; I've redirected it upstream here: https://github.com/kennethreitz/requests/issues/2110
Or perhaps urllib3.
OK, requests has kicked this back to us; it appears it might actually be a PyPI/Fastly issue.
Can folks getting this issue run this script with requests 2.3.0?

```python
import requests

session = requests.Session()
for i in range(100):
    print(i)
    session.get("https://pypi.python.org/simple/").content
```

Report back if it fails and where it fails. Also report the name and version of your proxy software, or any other information of the like that we can use to try to reproduce. And if you can go to http://debug.fastly.com/ and post the encrypted block at the top so I can forward it to Fastly, that would be great as well.
Oh, if the above script fails, can you also try it with https://imgur.com/ and https://google.com/?
Also, can you clarify that there is no address-based restriction, rate limiting, or the like? For example, the IP addresses associated with PyPI are also associated with a lot of other sites that Fastly hosts, like imgur, so could the proxies be denying based on the IP addresses?
So I ran the little script, and after 20 seconds of displaying output I get the following error:
Our corporate proxy doesn't have rate limiting or address-based restrictions.
Just like above, the script failed on the 52nd request for both pypi and imgur. Google worked fine.
The proxy is configured to refuse when a direct connection is attempted, hence the [Errno 111]. There is no rate limiting AFAIK. Furthermore, I can confirm that I've faced the same situation on two different proxy deployments: both Squid, version 3.1.10 on CentOS. Response from debug.fastly.com: H4sIAAAAAAAAA31SXU/bMBT9K1WeNlan+WxTKqRloYVsNFSkICZVQm7iJhaJ3dkOTUH8912HaRVi
It also fails on the 52nd request for me on https://pypi.python.org/simple/ and https://imgur.com/ (but with a different error than @shredder12 and @nullprobe -- see below).
All works fine when sending requests to https://google.com. Proxy is Squid version 3.1.20 running on Debian. http://debug.fastly.com/ output: H4sIAAAAAAAAA21SXW/aMBT9K1b20nY4xIFAaNUHlgKL1AZE0q8JaXIdN1hLbOo4BVr1v+863cam
Fails on the 52nd attempt. As above, works fine with https://google.com. Running Ubuntu 12.04 in host-only mode on VirtualBox, proxying through WinGate on a Windows 8.1 host. Traceback (most recent call last): H4sIAAAAAAAAA21T70/bMBD9V6x8AhanSZ2kBMSHLsCK+FXRQpHWaXKTI7Wa2Jnjri2o//vOKaOT
Hi, this seems to happen randomly on big requirements.txt files that I try to install with pip when using an HTTPS proxy. After investigation, I found that the problem seems to lie in the PoolManager of the urllib3 that is vendored in the pip installation (_vendor/requests/packages/urllib3/poolmanager.py:ProxyManager). I found a quick fix in the code that makes my installation pass correctly. To confirm what I'm saying, just change this single line in the following file of your pip installation:
The pip install works correctly afterward, when instantiating a new ProxyManager object every time. Here is an example of the error that I used to have for my pip install:
For the curious, the requirements.txt file I was testing is this one:
Hey @alexandrem, I'm not sure I'm grokking what you've actually changed. Was it in
Yes, my test change was in pip/_vendor/requests/adapters.py
Sorry, the diff was reversed, but basically I just added a True to the condition on line 209 of adapters.py to always create a ProxyManager instance, thus skipping the pool-manager caching logic.
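The vendored file itself is never quoted in the thread, so here is a hedged sketch of the caching behaviour being bypassed. The class and method names below are illustrative stand-ins, not requests' actual code; the point is only the difference between reusing one cached ProxyManager and building a fresh one per request, which is what the described workaround forces:

```python
class FakeProxyManager:
    """Stand-in for urllib3's ProxyManager so the sketch runs offline."""
    def __init__(self, proxy_url):
        self.proxy_url = proxy_url

class Adapter:
    """Illustrative sketch of an adapter that caches proxy managers."""
    def __init__(self, force_new=False):
        self.proxy_managers = {}     # cache keyed by proxy URL
        self.force_new = force_new   # the workaround: always rebuild

    def proxy_manager_for(self, proxy_url):
        # Original behaviour: reuse a cached manager for the same proxy.
        # Workaround: skip the cache and build a fresh manager every time.
        if self.force_new or proxy_url not in self.proxy_managers:
            manager = FakeProxyManager(proxy_url)
            if not self.force_new:
                self.proxy_managers[proxy_url] = manager
            return manager
        return self.proxy_managers[proxy_url]

cached = Adapter()
fresh = Adapter(force_new=True)
a = cached.proxy_manager_for("http://proxy:8080")
b = cached.proxy_manager_for("http://proxy:8080")
c = fresh.proxy_manager_for("http://proxy:8080")
d = fresh.proxy_manager_for("http://proxy:8080")
print(a is b, c is d)  # → True False
```

If the bug is in how a long-lived ProxyManager handles many sequential CONNECT tunnels, recreating it per request would mask the problem, which matches the observation that installs fail only partway through a long requirements file.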
If everyone who has run @dstufft's script above can try out the following script, it would help us narrow down the problem (and please note which proxy software you are behind):

```python
from requests.packages.urllib3 import ProxyManager

proxy_url = 'http://localhost:8080'  # Just an example, place your proxy URL here please
pool = ProxyManager(proxy_url)
for i in range(100):
    print(i)
    pool.request("GET", "https://pypi.python.org/simple/",
                 headers={'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*'}).read()
```

This is roughly equivalent to what requests sends.
This is still an issue in pip version 7.1.2, though I am seeing it for index URLs. For example, if you try to download pandas from an index, pandas will require numpy, which pip will then try to get from PyPI instead of the specified --index-url. Details: Python 2.7.
@AndiEcker, have you found any solution for that problem? I face the same issue from my office; I'd very much appreciate your help. This is the error:
@iamnagesh: the solution in my case was that our network administrator had to give me access to the internet without a proxy. But your error message looks different anyway - sorry, I don't know why your connection is showing a timeout error. If you have a slow internet connection, then maybe try to increase the timeout (unfortunately I don't know how to do this).
For the previous errors involving HTTP proxy authentication via NTLM, I would suggest using cntlm as a workaround. Setting it up locally with an NTLM hash of your proxy password solves a lot of different issues with apps and libraries that don't support corporate HTTP/HTTPS proxies very well. You just have to set your proxy env vars and app settings to http://localhost:3128
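As a concrete sketch of that setup, assuming cntlm's default listening port of 3128 (adjust to your cntlm.conf), the environment and pip invocation would look like this:

```shell
# Point pip (and most other tools) at the local cntlm listener,
# which performs the NTLM authentication against the upstream proxy.
export http_proxy=http://localhost:3128
export https_proxy=http://localhost:3128

# pip can also be told explicitly, which bypasses any env-var quirks:
pip install -r requirements.txt --proxy http://localhost:3128
```

This keeps the NTLM credentials out of every individual tool's configuration; only cntlm needs to know them.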
@iamnagesh, I am facing the same error while installing a package through pip in the Anaconda prompt.
@ashi-taka: in my case our network admins fixed it for me, and AFAIR the fix
Though the proxy configuration was added at system level, for some reason pip was giving the error [Errno 101] Network is unreachable. Using the --proxy option fixed the issue (thanks to @pcraston).
Ubuntu 14.04
Running pip with the --isolated flag solved the problem for me.
Got the same errors with pip 9.0.1 on Windows 10.
[update] It worked! I've found the line; it's now line 178. Thanks. Hi all, `C:\Users\Lenovo Local> pip install mkdocs` fails for me on Python 3.5.3.
Hi, I am not able to install lightgbm in Spyder. I get the following error and would appreciate any help on how to fix this. Yes, there is a proxy at work. Thanks! Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x000002B1188BDB38>: Failed to establish a new connection: [Errno 11002] getaddrinfo failed',)': /simple/lightgbm/
Well, it seems you have DNS trouble, which pip cannot do anything about. Also, please avoid appending to closed tickets; that always makes things opaque.
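For readers hitting the same `getaddrinfo failed` message: it means name resolution itself failed, before any proxy or TLS logic runs. A quick way to confirm that, using only the standard library (the hostname here is a placeholder; substitute the host pip was trying to reach, e.g. pypi.org):

```python
import socket

def can_resolve(host):
    """Return True if DNS can resolve `host` — the step that fails
    behind '[Errno 11002] getaddrinfo failed'."""
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

# "localhost" resolves without any network access; swap in the real
# hostname (e.g. "pypi.org") to test your actual environment.
print(can_resolve("localhost"))  # → True
```

If this returns False for the real hostname, the fix lies in the machine's DNS or proxy configuration, not in pip.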
`pip install -U <requirement> --proxy=my_proxy:port` worked for me!
Hi Senani, that did not work for me. Did I code it correctly? Let's say I need to download the implicit module: `pip install implicit --proxy=sm_proxy:xx28`. Also, I am behind a PAC firewall; each time I use the internet, I need to enter a login and password. Thanks a ton.
So I had a similar issue where pip installs were only failing for git repos, and only when I was behind a proxy. I was not able to get it resolved for git installs over HTTP, so I ended up changing my installs to use SSH, which worked for my environment.
Works behind proxy:
Doesn't work behind proxy:
The pip command below worked for me: `sudo pip --proxy http://proxy_username:proxy_password@proxyhost:port install <package_to_be_installed>`
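One pitfall with credentials embedded in the proxy URL like this (an aside, not from the thread): characters such as `@` or `:` in the password are significant in a URL and must be percent-encoded, or the URL parses incorrectly. The credentials below are hypothetical; only the standard library is used:

```python
from urllib.parse import quote

# Hypothetical credentials: 'p@ss:word' contains URL-significant
# characters and must be escaped before embedding in the proxy URL.
user = "proxy_user"
password = quote("p@ss:word", safe="")
proxy_url = f"http://{user}:{password}@proxyhost:8080"
print(proxy_url)  # → http://proxy_user:p%40ss%3Aword@proxyhost:8080
```

The encoded URL can then be passed to pip's `--proxy` option or the `https_proxy` environment variable.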
In my case the problem was with
The --isolated flag worked for me when I'm not behind a proxy, but the "Cannot connect to proxy" error keeps coming up!
Just try the below, for example:
In my case both solutions work:
I already had http_proxy and https_proxy set but was getting connection failures. Simply setting HTTP_PROXY and HTTPS_PROXY as well worked for me.
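An aside on why the variable's case can matter: Python's standard library accepts both spellings, but individual tools differ in which they read, so setting both is a safe habit. The stdlib behaviour can be checked directly (the proxy URL below is a placeholder):

```python
import os
from urllib.request import getproxies_environment

# Placeholder proxy URL; any well-formed value works for this demo.
os.environ.pop("https_proxy", None)           # clear any stale lowercase value
os.environ["HTTPS_PROXY"] = "http://proxy.example:8080"

# getproxies_environment() scans os.environ case-insensitively for
# names ending in _proxy, so the uppercase spelling is picked up too.
proxies = getproxies_environment()
print(proxies.get("https"))  # → http://proxy.example:8080
```

Tools that bypass this helper and read `os.environ` by exact name are the ones that break when only one spelling is set.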
I was able to resolve the issue by increasing resources on my proxy server. Initially, my proxy server was running 2 "servers" and allowed 5 "client" connections. After increasing to 10 servers and 20 client connections, pip installed packages perfectly.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
I'm using pip 1.5.5 on Python 2.7.6, CentOS 6.5 x86_64. Python is compiled and installed in /usr/local/. While installing packages from a requirements file, the first 12 packages are downloaded via the proxy, but after that pip tries to connect directly to the server instead of going through the proxy, and the request fails.
```shell
$ export https_proxy=http://proxy:8080
$ /usr/local/bin/pip2.7 install -r reqs.txt
```
This results in the first 12 packages being downloaded via the proxy, but the next one is a direct HTTPS request to the PyPI server. I've validated this via a packet trace and the proxy logs.
Note that the installation works flawlessly using /usr/bin/pip, v1.3.1.
Thoughts?