
New resolver takes a very long time to complete #9187

Closed
nijel opened this issue Nov 30, 2020 · 192 comments
Labels
kind: crash For situations where pip crashes

Comments


nijel commented Nov 30, 2020

What did you want to do?

One of the CI jobs for Weblate is to install minimal versions of dependencies. We use requirements-builder to generate the minimal version requirements from the ranges we use normally.

The pip install -r requirements-min.txt command seems to loop infinitely after some time. This started happening with 20.3; before that it worked just fine.

Output

Requirement already satisfied: google-auth<2.0dev,>=1.21.1 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (1.23.0)
Requirement already satisfied: pytz>dev in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from celery[redis]==4.4.5->-r requirements-min.txt (line 3)) (2020.4)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.6.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (1.52.0)
Requirement already satisfied: six>=1.9.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from bleach==3.1.1->-r requirements-min.txt (line 1)) (1.15.0)
Requirement already satisfied: protobuf>=3.12.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (3.14.0)
Requirement already satisfied: grpcio<2.0dev,>=1.29.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (1.33.2)
Requirement already satisfied: google-auth<2.0dev,>=1.21.1 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (1.23.0)
Requirement already satisfied: pytz>dev in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from celery[redis]==4.4.5->-r requirements-min.txt (line 3)) (2020.4)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.6.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (1.52.0)
Requirement already satisfied: six>=1.9.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from bleach==3.1.1->-r requirements-min.txt (line 1)) (1.15.0)
Requirement already satisfied: protobuf>=3.12.0 in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (from google-api-core[grpc]<2.0.0dev,>=1.22.0->google-cloud-translate==3.0.0->-r requirements-min.txt (line 63)) (3.14.0)

This seems to repeat forever (well, for 3 hours so far; see https://github.com/WeblateOrg/weblate/runs/1474960864?check_suite_focus=true)

Additional information

Requirements file triggering this: requirements-min.txt

It takes quite some time until it gets to the above loop. There is most likely something problematic in the dependency set...


dstufft commented Nov 30, 2020

I'm going to use this issue to centralize incoming reports of situations that seemingly run for a long time, instead of having each one end up in its own issue or scattered around.


dstufft commented Nov 30, 2020

@jcrist said in #8664 (comment)

Note: I was urged to comment here about our experience from twitter.

We (prefect) are a bit late on testing the new resolver (only getting around to it with the 20.3 release). We're finding that install times are now in the 20+ min range (I've actually never had one finish), previously this was at most a minute or two. The issue here seems to be in the large search space (prefect has loads of optional dependencies, for CI and some docker images we install all of them) coupled with backtracking.

I enabled verbose logs to try to figure out what the offending package(s) were but wasn't able to make much sense of them. I'm seeing a lot of retries for some dependencies with different versions of setuptools, as well as different versions of boto3. For our CI/docker builds we can add constraints to speed things up (as suggested here), but we're reluctant to increase constraints in our setup.py as we don't want to overconstrain downstream users. At the same time, we have plenty of novice users who are used to doing pip install prefect[all_extras] - telling them they need to add additional constraints to make this complete in a reasonable amount of time seems unpleasant. I'm not sure what the best path forward here is.

I've uploaded verbose logs from one run here (killed after several minutes of backtracking). If people want to try this themselves, you can run:

pip install "git+https://github.com/PrefectHQ/prefect.git#egg=prefect[all_extras]"

Any advice here would be helpful - for now we're pinning pip to 20.2.4, but we'd like to upgrade once we've figured out a solution to the above. Happy to provide more logs or try out suggestions as needed.

Thanks for all y'all do on pip and pypa!

@dstufft dstufft changed the title Runs for hours, maybe in an infinite loop New resolver in 20.3 takes an extremely long time to resolve dependencies Nov 30, 2020
@dstufft dstufft changed the title New resolver in 20.3 takes an extremely long time to resolve dependencies New resolver in 20.3 takes a very long time to complete Nov 30, 2020
@dstufft dstufft changed the title New resolver in 20.3 takes a very long time to complete New resolver takes a very long time to complete Nov 30, 2020

dstufft commented Nov 30, 2020

These might end up being resolved by #9185

@dstufft dstufft pinned this issue Nov 30, 2020
@brainwane (Contributor)

Thanks, @dstufft.

I'll mention here some useful workaround tips from the documentation -- in particular, the first and third points may be helpful to folks who end up here:

  • If pip is taking longer to install packages, read Dependency resolution backtracking for ways to reduce the time pip spends backtracking due to dependency conflicts.

  • If you don’t want pip to actually resolve dependencies, use the --no-deps option. This is useful when you have a set of package versions that work together in reality, even though their metadata says that they conflict. For guidance on a long-term fix, read Fixing conflicting dependencies.

  • If you run into resolution errors and need a workaround while you’re fixing their root causes, you can choose the old resolver behavior using the flag --use-deprecated=legacy-resolver. This will work until we release pip 21.0 (see Deprecation timeline).


nijel commented Nov 30, 2020

For my case, the problematic behavior can be reproduced much faster with pip install 'google-cloud-translate==3.0.0' 'requests==2.20.0' 'setuptools==36.0.1', so it sounds like #9185 might improve it.

The legacy resolver bails out on this quickly with: google-auth 1.23.0 requires setuptools>=40.3.0, but you'll have setuptools 36.0.1 which is incompatible.

@pradyunsg (Member)

One other idea here is stopping after 100 backtracks (or some similar limit) with a message saying "hey, pip is backtracking due to conflicts on $package a lot".
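The backtrack-limit idea could be sketched roughly like this (a toy illustration only; `ResolutionTooDeep`, the `resolve` shape, and the conflict representation are all hypothetical, not pip's actual internals):

```python
class ResolutionTooDeep(Exception):
    """Raised once the resolver has rejected more than `limit` candidates."""

def resolve(packages, versions, conflicts, limit=100):
    """Pick one version per package, newest first, backtracking on conflict.

    `versions` maps package -> versions (newest first); `conflicts` is a set
    of incompatible ((pkg, ver), (pkg, ver)) pairs.  Every rejected candidate
    counts as a backtrack; past `limit`, bail out with an actionable hint.
    """
    backtracks = 0
    chosen = {}

    def compatible(pkg, ver):
        # A candidate is usable if it conflicts with nothing chosen so far.
        return all((p_v, (pkg, ver)) not in conflicts and
                   ((pkg, ver), p_v) not in conflicts
                   for p_v in chosen.items())

    def walk(i):
        nonlocal backtracks
        if i == len(packages):
            return True
        pkg = packages[i]
        for ver in versions[pkg]:
            if compatible(pkg, ver):
                chosen[pkg] = ver
                if walk(i + 1):
                    return True
                del chosen[pkg]
            backtracks += 1
            if backtracks > limit:
                raise ResolutionTooDeep(
                    f"pip is backtracking a lot due to conflicts on {pkg}; "
                    "consider constraining its versions")
        return False

    return chosen if walk(0) else None
```

With a small limit, a pathological graph fails fast with a message naming the troublesome package, instead of grinding through every candidate.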


dstufft commented Nov 30, 2020

I wonder how much time is taken up by downloading and unzipping versus time actually spent in the resolver itself?

@pradyunsg (Member)

I wonder how much time is taken up by downloading and unzipping versus time actually spent in the resolver itself?

Most of it, last I checked. Unless we're hitting some very bad graph situation, in which case... 🤷 the users are better off giving pip the pins.


tedivm commented Dec 1, 2020

I'm having our staff fill out that Google form wherever they can, but I just want to mention that pretty much all of our builds are experiencing issues with this. Things that worked fine and had a build time of about 90 seconds are now timing out our CI builds. In theory we could increase the timeout, but we're paying for these machines by the minute, so having all of our builds take far longer is a painful choice. We've switched over to enforcing the legacy resolver on all of our builds for now.

@pradyunsg (Member)

As a general note to users reaching this page, please read https://pip.pypa.io/en/stable/user_guide/#dependency-resolution-backtracking.


tedivm commented Dec 1, 2020

I was asked to add some more details from twitter, so here are some additional thoughts. Right now the four solutions to this problem are:

  1. Just wait for it to finish
  2. Use trial and error methods to reduce versions checked using constraints
  3. Record and reuse those trial error methods in a new "constraints.txt" file
  4. Reduce the number of supported versions "during development"

Waiting it out is literally too expensive to consider

This solution seems to rely on downloading an epic ton of packages. In the era of cloud this means:

  • Larger hard drives are needed to store the additional packages
  • More bandwidth is consumed downloading these packages
  • It takes longer to process everything due to the need to decompress these packages

These all cost money, although the exact balance will depend on the packages (people struggling with a beast like tensorflow might choke on the hard drive and bandwidth, while people with smaller packages just get billed for the build time).

What's even more expensive is the developer time wasted during an operation that used to take (literally) 90s that now takes over 20 minutes (it might take longer but it times out on our CI systems).

We literally can't afford to use this dependency resolution system.

Trial and error constraints are extremely burdensome

This adds a whole new set of processes to everyone's dev cycle where not only do they have to do the normal dev work, but now they need to optimize the black box of this resolver. Even the advice on the page is extremely trial and error, basically saying to start with the first package giving you trouble and continue iterating until your build times are reasonable.

Adding more config files complicates an already overcomplicated ecosystem

Right now we already have to navigate the differences between setup.py, requirements.txt, setup.cfg, and pyproject.toml; adding constraints.txt on top just adds even more burden (and confusion) to maintaining Python packages.

Reducing versions checked during development doesn't scale

Restricting versions during development but releasing the package without those constraints means that the users of that package are going to have to reinvent those constraints themselves during development. If I install a popular package my build times could explode until I duplicate their efforts. There's no way to share those constraints other than copy/paste methods, which adds to the maintenance burden.

What this is ultimately going to result in is people not using constraints at all, instead limiting the dependency versions directly based not off of actual compatibility but a mix of compatibility and build times. This will make it harder to support smaller packages in the long term.


dstufft commented Dec 1, 2020

Most of it, last I checked.

Might be a good reason to prioritize pypi/warehouse#8254


pfmoore commented Dec 1, 2020

Might be a good reason to prioritize pypi/warehouse#8254

Definitely. And a sdist equivalent when PEP 643 is approved and implemented.

This solution seems to rely on downloading an epic ton of packages

It doesn't directly rely on downloading, but it does rely on knowing the metadata for packages, and for various historical reasons, the only way to get that data is currently by downloading (and in the case of source distributions, building) the package.

That is a huge overhead, although pip's download cache helps a lot here (maybe you could persist pip's cache in your CI setup?). On the plus side, it only hits hard in cases where there are a lot of dependency restrictions (where the "obvious" choice of the latest version of a package is blocked by a dependency from another package), and it has only tended to be really significant in cases where there is no valid solution anyway. That is not always immediately obvious, though: the old resolver would happily install invalid sets of packages, so the issue looks like "old resolver worked, new one fails" when it's actually "old one silently broke stuff, new one fails to install instead".

This doesn't help you address the issue, I know, but hopefully it gives some background as to why the new resolver is behaving as it is.


pradyunsg commented Dec 1, 2020

@tedivm please look into using pip-tools to perform dependency resolution as a separate step from deployment. It's essentially point 4 -- "local" dependency resolution with the deployment only seeing pinned versions.

@pradyunsg pradyunsg added the kind: crash For situations where pip crashes label Dec 1, 2020

dstufft commented Dec 1, 2020

Actually, it would be an interesting experiment: for these pathological cases people are experimenting with, if they let the resolver complete once, persist the cache, and then try again, is it faster? If it's still hours long even with a cache, then that suggests pypi/warehouse#8254 isn't going to help much.

I don't know what we're doing now exactly, but I also wonder if it would make sense to stop exhaustively searching the versions after a certain point. This would basically be a trade-off of saying that we're going to start making assumptions about how dependencies evolve over time. I assume we're currently starting with the latest version and iterating backwards one version at a time; is that correct? If so, what if we did something like:

  1. Iterate backwards one version at a time until we fail resolution X times.
  2. Start a binary search, cut the remaining candidates in half and try with that.
    2a. If it works, start taking the binary search towards the "newer" side (cut that in half, try again, etc).
    2b. If it fails, start taking the binary search towards the "older" side (cut that in half, try again, etc).

This isn't exactly the correct use of a binary search, because the list of versions isn't really "sorted" in that way, but it would function somewhat similarly to git bisect? The biggest problem with it is that it will skip over good versions if the latest N versions all fail and the older half of versions all fail, but the middle versions are "OK".
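The bisect-style probing described above might look something like this (purely hypothetical; `works(v)` stands in for the expensive "download, build, check compatibility" step, and version lists are newest first):

```python
def bisect_versions(versions, works, linear_tries=3):
    """Find a working version from a newest-first version list.

    First scan the newest `linear_tries` versions; if all fail, fall back
    to a git-bisect-style probe of the remainder, steering toward the
    newest version that works.
    """
    for v in versions[:linear_tries]:
        if works(v):
            return v
    lo, hi = linear_tries, len(versions) - 1
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if works(versions[mid]):
            best = versions[mid]   # works: keep it, try the newer half
            hi = mid - 1
        else:
            lo = mid + 1           # fails: move toward the older half
    return best
```

As noted above, this trades completeness for far fewer candidates probed: it can miss good versions entirely when the failures are not contiguous.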

Another possible idea is, instead of a binary search, to do a similar thing but instead of bucketing the version set in halves, bucket versions to match their version "cardinality". IOW, if this has a lot of major versions, bucket them by major version; if it has few major versions but a lot of minor versions, bucket by that, etc. That way you divide up the problem space, then start iterating backwards trying the first (or the last?) version in each bucket until you find one that works, then constrain the solver to just that bucket (and maybe one bucket newer, if you're testing the last version instead of the first?).
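A sketch of that bucketing idea, assuming versions are (major, minor, patch) tuples (illustrative only; real version specifiers are messier than this):

```python
def bucket_by_cardinality(versions):
    """Group versions by the most significant component that varies.

    Many distinct majors -> one bucket per major; a single major but
    several minors -> one bucket per minor; and so on.  A resolver could
    then probe one candidate per bucket before committing to a bucket.
    """
    for depth in range(3):
        keys = {v[:depth + 1] for v in versions}
        if len(keys) > 1 or depth == 2:
            # Newest buckets first, preserving input order inside each.
            return {key: [v for v in versions if v[:depth + 1] == key]
                    for key in sorted(keys, reverse=True)}
```

For example, a release history with two majors gets two buckets, while a history that only ever bumps minors gets one bucket per minor version.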

I dunno, it seems like exhaustively searching the space is the "correct" thing to do if you want to always come up with the answer if one exists anywhere, but if we can't make that fast enough, even with changes to warehouse etc, we could probably try to be smart about using heuristics to narrow the search space, under the assumption that version ranges typically don't change that often and when they do, they don't often change every single release.

Maybe if we go into heuristics mode, we emit a warning that we're doing it, suggest people provide more information to the solver, etc. Maybe provide a flag like --please-be-exhaustive-its-ok-ill-wait to disable the heuristics.

Maybe we're already doing this and I'm just dumb :)


pfmoore commented Dec 1, 2020

We're not doing it, and you're not dumb :-) But it's pretty hard to do stuff like that - most resolution algorithms I've seen are based on the assumption that getting dependency data is cheap (many aren't even usable by pip because they assume all dependency info is available from the start). So we're getting into "designing new algorithms for well-known hard CS problems" territory :-(


uranusjr commented Dec 1, 2020

Another possible idea is instead of a binary search, do a similar idea but instead of bucketing the version set in halves, try to bucket them into buckets that match their version "cardinality". IOW, if this has a lot of major versions, bucket them by major version, if it has few major versions, but a lot of minor versions, bucket it by that, etc.

Some resolvers I surveyed indeed do this, especially ones from ecosystems that promote semver heavily (IIRC Cargo?), since major version bumps there imply more semantics, so this is at least somewhat charted territory.

The Python community does not generally adhere to semver that strictly, but we may still be able to do this, since the resolver never promised to return the best solution, only a good enough one (i.e. if both 2.0.1 and 1.9.3 satisfy, the resolver does not have to choose 2.0.1).

@pradyunsg (Member)

The other part is how we handle failure-to-build. With our current processes, we'd have to get build deps and do the build (or at best call prepare_metadata_for_build_wheel) to get the info.

With binary search-like semantics, we'd have to be lenient about build failures and allow pip to attempt-to-use a different version of the package on failures (compared to outright failing as we do today).

Maybe provide a flag like --please-be-exhaustive-its-ok-ill-wait to disable the heuristics.

I think stopping after we've backtracked 100+ times and saying "hey, this is taking too long. Help me by reducing versions of $packages, or tell me to try harder with --option." is something we can do relatively easily now.

If folks are on board with this, let's pick a number (I've said 100, but I pulled that out of the air) and add this in?


dstufft commented Dec 1, 2020

Do we have a good sense of whether these cases that take a really long time to solve are typically ones where there is no answer, and exhaustively searching the space takes hours because of our slow time per candidate, or cases where there is a successful answer but it just takes us a while to get there?


nijel commented Dec 1, 2020

@dstufft in my case, there was no suitable solution (see #9187 (comment)). I guessed which dependencies might be the problematic ones, and with a reduced set of packages it doesn't take that long and produces the expected error. With the full requirements-min.txt it didn't complete in hours.

With nearly 100 pinned dependencies, the space to search is enormous, and pip ends up (maybe) infinitely printing "Requirement already satisfied:" while searching for a solution (see https://github.com/WeblateOrg/weblate/runs/1474960864?check_suite_focus=true for the long log; it was killed after some hours). I just realized that the CI process is slightly more complex than what I've described: it first installs packages based on the ranges, then generates a list of minimal versions and tries to adjust the existing virtualenv. That's probably where the "Requirement already satisfied" logs come from.

The problematic dependency chain in my case was:

  • google-cloud-translate==3.0.0 from command line
  • setuptools==36.0.1 from command line
  • google-api-core[grpc] >= 1.22.0, < 2.0.0dev from google-cloud-translate==3.0.0
  • google-auth >= 1.19.1, < 2.0dev from google-api-core
  • setuptools>=40.3.0 from google-auth (any version in the range)

In the end, I think the problem is that it tries to find a solution in areas where there can't be any. With a full pip cache:

$ time pip install  google-cloud-translate==3.0.0 setuptools==36.0.1
...

real	0m6,206s
user	0m5,136s
sys	0m0,242s
$ time pip install  google-cloud-translate==3.0.0 setuptools==36.0.1 requests==2.20.0
...

real	0m28,724s
user	0m25,162s
sys	0m0,283s

In this case, adding requests==2.20.0 (which can be installed without any problem alongside either of the other dependencies) increases the time nearly fivefold. This is caused by pip looking at different chardet and certifi versions for some reason.


notatallshaw commented Aug 30, 2021

Hi all,

I was able to reproduce the OP's issue using their linked requirements.txt: https://github.com/pypa/pip/files/5618233/requirements-min.txt

I have been experimenting with an optimization for cases where pip has a large solution space and has to backtrack, in my patched version of pip: #10201 (comment). I would appreciate it if anyone with test cases could provide them, or try the patched version of pip themselves.

While it still took a few minutes, it was able to home in on a resolution-impossible error: the user requested setuptools==36.0.1, but google-api-core[grpc] depends on setuptools>=40.3.0, as you can see in the following output:

$ python -m pip download -r req.txt -d ./downloads
Collecting bleach==3.1.1
  Using cached bleach-3.1.1-py2.py3-none-any.whl (150 kB)
Collecting borgbackup==1.1.9
  Using cached borgbackup-1.1.9.tar.gz (3.5 MB)
Collecting celery[redis]==4.4.5
  Using cached celery-4.4.5-py2.py3-none-any.whl (426 kB)
Collecting cssselect==1.0.0
  Using cached cssselect-1.0.0-py2.py3-none-any.whl (15 kB)
Collecting Cython==0.29.14
  Using cached Cython-0.29.14.tar.gz (2.1 MB)
Collecting diff-match-patch==20200713
  Using cached diff_match_patch-20200713-py3-none-any.whl (61 kB)
Collecting Django==3.1
  Using cached Django-3.1-py3-none-any.whl (7.8 MB)
Collecting django-appconf==1.0.3
  Using cached django_appconf-1.0.3-py2.py3-none-any.whl (6.4 kB)
Collecting django-compressor==2.4
  Using cached django_compressor-2.4-py2.py3-none-any.whl (126 kB)
Collecting django-crispy-forms==1.9.0
  Using cached django_crispy_forms-1.9.0-py2.py3-none-any.whl (107 kB)
Collecting django-filter==2.4.0
  Using cached django_filter-2.4.0-py3-none-any.whl (73 kB)
Collecting django-redis==4.11.0
  Using cached django_redis-4.11.0-py3-none-any.whl (18 kB)
Collecting djangorestframework==3.11.0
  Using cached djangorestframework-3.11.0-py3-none-any.whl (911 kB)
Collecting filelock==3.0.0
  Using cached filelock-3.0.0.tar.gz (5.6 kB)
Collecting GitPython==2.1.15
  Using cached GitPython-2.1.15-py2.py3-none-any.whl (452 kB)
Collecting hiredis==1.0.1
  Using cached hiredis-1.0.1.tar.gz (54 kB)
Collecting html2text==2019.8.11
  Using cached html2text-2019.8.11-py2.py3-none-any.whl (31 kB)
Collecting jellyfish==0.7.2
  Using cached jellyfish-0.7.2.tar.gz (133 kB)
Collecting jsonschema==3.0.0
  Using cached jsonschema-3.0.0-py2.py3-none-any.whl (54 kB)
Collecting lxml==4.4.0
  Using cached lxml-4.4.0.tar.gz (4.5 MB)
Collecting methodtools==0.4.2
  Using cached methodtools-0.4.2.tar.gz (3.0 kB)
Collecting misaka==2.1.0
  Using cached misaka-2.1.0.tar.gz (127 kB)
Collecting openpyxl==2.6.0
  Using cached openpyxl-2.6.0.tar.gz (172 kB)
Collecting Pillow==6.0.0
  Using cached Pillow-6.0.0.tar.gz (29.5 MB)
Collecting pycairo==1.15.3
  Using cached pycairo-1.15.3.tar.gz (177 kB)
Collecting pygobject==3.27.0
  Using cached pygobject-3.27.0.tar.gz (602 kB)
Collecting pyparsing==2.4.0
  Using cached pyparsing-2.4.0-py2.py3-none-any.whl (62 kB)
Collecting python-dateutil==2.8.1
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting python-redis-lock==3.4.0
  Using cached python_redis_lock-3.4.0-py2.py3-none-any.whl (11 kB)
Collecting requests==2.20.0
  Using cached requests-2.20.0-py2.py3-none-any.whl (60 kB)
Collecting sentry-sdk==0.13.0
  Using cached sentry_sdk-0.13.0-py2.py3-none-any.whl (91 kB)
Collecting setuptools==36.0.1
  Using cached setuptools-36.0.1-py2.py3-none-any.whl (476 kB)
Collecting siphashc==1.2
  Using cached siphashc-1.2.tar.gz (6.3 kB)
Collecting social-auth-app-django==4.0.0
  Using cached social_auth_app_django-4.0.0-py3-none-any.whl (24 kB)
Collecting social-auth-core==3.3.3
  Using cached social_auth_core-3.3.3-py3-none-any.whl (326 kB)
Collecting translate-toolkit==3.1.1
  Using cached translate-toolkit-3.1.1.tar.gz (6.0 MB)
Collecting translation-finder==2.6
  Using cached translation_finder-2.6-py3-none-any.whl (81 kB)
Collecting user-agents==2.0
  Using cached user-agents-2.0.tar.gz (9.4 kB)
Collecting weblate-language-data==2020.13
  Using cached weblate_language_data-2020.13-py3-none-any.whl (2.0 MB)
Collecting weblate-schemas==0.4
  Using cached weblate_schemas-0.4-py3-none-any.whl (7.9 kB)
Collecting Whoosh==2.7.4
  Using cached Whoosh-2.7.4-py2.py3-none-any.whl (468 kB)
Collecting aeidon==1.6.0
  Using cached aeidon-1.6.0-py3-none-any.whl (166 kB)
Collecting akismet==1.0.1
  Using cached akismet-1.0.1-py2.py3-none-any.whl (5.3 kB)
Collecting boto3==1.15.0
  Using cached boto3-1.15.0-py2.py3-none-any.whl (129 kB)
Collecting chardet==3.0.4
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting django-auth-ldap==1.3.0
  Using cached django_auth_ldap-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting git-review==1.27.0
  Using cached git_review-1.27.0-py2.py3-none-any.whl (40 kB)
Collecting google-cloud-translate==3.0.0
  Using cached google_cloud_translate-3.0.0-py2.py3-none-any.whl (90 kB)
Collecting iniparse==0.5
  Using cached iniparse-0.5-py3-none-any.whl (24 kB)
Collecting Mercurial==5.2
  Using cached mercurial-5.2.tar.gz (7.3 MB)
Collecting mysqlclient==2.0.1
  Using cached mysqlclient-2.0.1.tar.gz (87 kB)
Collecting phply==1.2.5
  Using cached phply-1.2.5-py2.py3-none-any.whl (74 kB)
Collecting psycopg2-binary==2.7.7
  Using cached psycopg2-binary-2.7.7.tar.gz (428 kB)
Collecting python3-saml==1.2.1
  Using cached python3_saml-1.2.1-py3-none-any.whl (65 kB)
Collecting ruamel.yaml==0.16.0
  Using cached ruamel.yaml-0.16.0-py2.py3-none-any.whl (122 kB)
Collecting tesserocr==2.3.0
  Using cached tesserocr-2.3.0.tar.gz (54 kB)
Collecting zeep==3.2.0
  Using cached zeep-3.2.0-py2.py3-none-any.whl (98 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting webencodings
  Using cached webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
Collecting msgpack-python!=0.5.0,!=0.5.1,!=0.5.2,!=0.5.3,!=0.5.4,!=0.5.5,<=0.5.6,>=0.4.6
  Using cached msgpack-python-0.5.6.tar.gz (138 kB)
Collecting pytz>dev
  Using cached pytz-2021.1-py2.py3-none-any.whl (510 kB)
Collecting vine==1.3.0
  Using cached vine-1.3.0-py2.py3-none-any.whl (14 kB)
Collecting kombu<4.7,>=4.6.10
  Using cached kombu-4.6.11-py2.py3-none-any.whl (184 kB)
Collecting future>=0.18.0
  Using cached future-0.18.2.tar.gz (829 kB)
Collecting billiard<4.0,>=3.6.3.0
  Using cached billiard-3.6.4.0-py3-none-any.whl (89 kB)
Collecting redis>=3.2.0
  Using cached redis-3.5.3-py2.py3-none-any.whl (72 kB)
Collecting asgiref~=3.2.10
  Using cached asgiref-3.2.10-py3-none-any.whl (19 kB)
Collecting sqlparse>=0.2.2
  Using cached sqlparse-0.4.1-py3-none-any.whl (42 kB)
Collecting rcssmin==1.0.6
  Using cached rcssmin-1.0.6.tar.gz (582 kB)
Collecting rjsmin==1.1.0
  Using cached rjsmin-1.1.0.tar.gz (412 kB)
Collecting gitdb2<3,>=2
  Using cached gitdb2-2.0.6-py2.py3-none-any.whl (63 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.2.0-py2.py3-none-any.whl (53 kB)
Collecting pyrsistent>=0.14.0
  Using cached pyrsistent-0.18.0-cp39-cp39-manylinux1_x86_64.whl (117 kB)
Collecting wirerope>=0.4.2
  Using cached wirerope-0.4.5.tar.gz (8.6 kB)
Collecting cffi>=1.0.0
  Using cached cffi-1.14.6-cp39-cp39-manylinux1_x86_64.whl (405 kB)
Collecting jdcal
  Using cached jdcal-1.4.1-py2.py3-none-any.whl (9.5 kB)
Collecting et_xmlfile
  Using cached et_xmlfile-1.1.0-py3-none-any.whl (4.7 kB)
Collecting idna<2.8,>=2.5
  Using cached idna-2.7-py2.py3-none-any.whl (58 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.5.30-py2.py3-none-any.whl (145 kB)
Collecting urllib3<1.25,>=1.21.1
  Using cached urllib3-1.24.3-py2.py3-none-any.whl (118 kB)
Collecting requests-oauthlib>=0.6.1
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting oauthlib>=1.0.3
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting defusedxml>=0.5.0rc1
  Using cached defusedxml-0.7.1-py2.py3-none-any.whl (25 kB)
Collecting python3-openid>=3.0.10
  Using cached python3_openid-3.2.0-py3-none-any.whl (133 kB)
Collecting PyJWT>=1.4.0
  Using cached PyJWT-2.1.0-py3-none-any.whl (16 kB)
Collecting cryptography>=1.4
  Using cached cryptography-3.4.8-cp36-abi3-manylinux_2_24_x86_64.whl (3.0 MB)
Collecting ua-parser>=0.8.0
  Using cached ua_parser-0.10.0-py2.py3-none-any.whl (35 kB)
Collecting pyenchant<3.0,>=2.0
  Using cached pyenchant-2.0.0.tar.gz (64 kB)
Collecting s3transfer<0.4.0,>=0.3.0
  Using cached s3transfer-0.3.7-py2.py3-none-any.whl (73 kB)
Collecting botocore<1.19.0,>=1.18.0
  Using cached botocore-1.18.18-py2.py3-none-any.whl (6.7 MB)
Collecting jmespath<1.0.0,>=0.7.1
  Using cached jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting pyldap
  Using cached pyldap-3.0.0.post1-py3-none-any.whl (3.0 kB)
Collecting google-cloud-core<2.0dev,>=1.1.0
  Using cached google_cloud_core-1.7.2-py2.py3-none-any.whl (28 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.22.0
  Using cached google_api_core-1.31.2-py2.py3-none-any.whl (93 kB)
Collecting libcst>=0.2.5
  Using cached libcst-0.3.20-py3-none-any.whl (514 kB)
Collecting proto-plus>=0.4.0
  Using cached proto_plus-1.19.0-py3-none-any.whl (42 kB)
Collecting ply
  Using cached ply-3.11-py2.py3-none-any.whl (49 kB)
Collecting xmlsec>=0.6.0
  Using cached xmlsec-1.3.11.tar.gz (61 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
    Preparing wheel metadata ... done
Collecting isodate>=0.5.0
  Using cached isodate-0.6.0-py2.py3-none-any.whl (45 kB)
Collecting cached-property>=1.3.0
  Using cached cached_property-1.5.2-py2.py3-none-any.whl (7.6 kB)
Collecting requests-toolbelt>=0.7.1
  Using cached requests_toolbelt-0.9.1-py2.py3-none-any.whl (54 kB)
Collecting appdirs>=1.4.0
  Using cached appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB)
Collecting pycparser
  Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting smmap2>=2.0.0
  Using cached smmap2-3.0.1-py3-none-any.whl (1.1 kB)
Collecting google-auth<2.0dev,>=1.25.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting packaging>=14.3
  Using cached packaging-21.0-py3-none-any.whl (40 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.22.0
  Using cached google_api_core-1.31.1-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.31.0-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.30.0-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.29.0-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.28.0-py2.py3-none-any.whl (92 kB)
  Using cached google_api_core-1.27.0-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.26.3-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.26.2-py2.py3-none-any.whl (93 kB)
  Using cached google_api_core-1.26.1-py2.py3-none-any.whl (92 kB)
  Using cached google_api_core-1.26.0-py2.py3-none-any.whl (92 kB)
  Using cached google_api_core-1.25.1-py2.py3-none-any.whl (92 kB)
  Using cached google_api_core-1.25.0-py2.py3-none-any.whl (92 kB)
  Using cached google_api_core-1.24.1-py2.py3-none-any.whl (92 kB)
Collecting protobuf>=3.12.0
  Using cached protobuf-3.17.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.53.0-py2.py3-none-any.whl (198 kB)
Collecting grpcio<2.0dev,>=1.29.0
  Using cached grpcio-1.39.0-cp39-cp39-manylinux2014_x86_64.whl (4.3 MB)
Collecting amqp<2.7,>=2.6.0
  Using cached amqp-2.6.1-py2.py3-none-any.whl (48 kB)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting pyyaml>=5.2
  Using cached PyYAML-5.4.1-cp39-cp39-manylinux1_x86_64.whl (630 kB)
Collecting typing-extensions>=3.7.4.2
  Using cached typing_extensions-3.10.0.1-py3-none-any.whl (26 kB)
Collecting python-ldap>=3.0.0b1
  Using cached python-ldap-3.3.1.tar.gz (379 kB)
Collecting cachetools<5.0,>=2.0.0
  Using cached cachetools-4.2.2-py3-none-any.whl (11 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.7.2-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-auth<2.0dev,>=1.21.1
  Using cached google_auth-1.34.0-py2.py3-none-any.whl (152 kB)
  Using cached google_auth-1.33.1-py2.py3-none-any.whl (152 kB)
  Using cached google_auth-1.33.0-py2.py3-none-any.whl (151 kB)
  Using cached google_auth-1.32.1-py2.py3-none-any.whl (147 kB)
  Using cached google_auth-1.32.0-py2.py3-none-any.whl (147 kB)
  Using cached google_auth-1.31.0-py2.py3-none-any.whl (147 kB)
  Using cached google_auth-1.30.2-py2.py3-none-any.whl (146 kB)
  Using cached google_auth-1.30.1-py2.py3-none-any.whl (146 kB)
  Using cached google_auth-1.30.0-py2.py3-none-any.whl (146 kB)
  Using cached google_auth-1.29.0-py2.py3-none-any.whl (142 kB)
  Using cached google_auth-1.28.1-py2.py3-none-any.whl (136 kB)
  Using cached google_auth-1.28.0-py2.py3-none-any.whl (136 kB)
  Using cached google_auth-1.27.1-py2.py3-none-any.whl (136 kB)
  Using cached google_auth-1.27.0-py2.py3-none-any.whl (135 kB)
  Using cached google_auth-1.26.1-py2.py3-none-any.whl (116 kB)
  Using cached google_auth-1.26.0-py2.py3-none-any.whl (135 kB)
  Using cached google_auth-1.25.0-py2.py3-none-any.whl (116 kB)
  Using cached google_auth-1.24.0-py2.py3-none-any.whl (114 kB)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of amqp to determine which version is compatible with other requirements. This could take a while.
Collecting amqp<2.7,>=2.6.0
  Using cached amqp-2.6.0-py2.py3-none-any.whl (47 kB)
INFO: pip is looking at multiple versions of webencodings to determine which version is compatible with other requirements. This could take a while.
Collecting webencodings
  Using cached webencodings-0.5.tar.gz (9.5 kB)
INFO: pip is looking at multiple versions of pyldap to determine which version is compatible with other requirements. This could take a while.
Collecting pyldap
  Using cached pyldap-3.0.0.tar.gz (1.1 kB)
INFO: pip is looking at multiple versions of ply to determine which version is compatible with other requirements. This could take a while.
Collecting ply
  Using cached ply-3.10.tar.gz (150 kB)
INFO: pip is looking at multiple versions of jdcal to determine which version is compatible with other requirements. This could take a while.
Collecting jdcal
  Using cached jdcal-1.4-py2.py3-none-any.whl (9.5 kB)
INFO: pip is looking at multiple versions of et-xmlfile to determine which version is compatible with other requirements. This could take a while.
Collecting et_xmlfile
  Using cached et_xmlfile-1.0.1.tar.gz (8.4 kB)
INFO: pip is looking at multiple versions of celery to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of xmlsec to determine which version is compatible with other requirements. This could take a while.
Collecting xmlsec>=0.6.0
  Using cached xmlsec-1.3.10.tar.gz (62 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
    Preparing wheel metadata ... done
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of wirerope to determine which version is compatible with other requirements. This could take a while.
Collecting wirerope>=0.4.2
  Using cached wirerope-0.4.4.tar.gz (8.6 kB)
INFO: pip is looking at multiple versions of urllib3 to determine which version is compatible with other requirements. This could take a while.
Collecting urllib3<1.25,>=1.21.1
  Using cached urllib3-1.24.2-py2.py3-none-any.whl (131 kB)
INFO: pip is looking at multiple versions of ua-parser to determine which version is compatible with other requirements. This could take a while.
Collecting ua-parser>=0.8.0
  Using cached ua_parser-0.9.0-py2.py3-none-any.whl (35 kB)
INFO: pip is looking at multiple versions of sqlparse to determine which version is compatible with other requirements. This could take a while.
Collecting sqlparse>=0.2.2
  Using cached sqlparse-0.4.0-py3-none-any.whl (42 kB)
INFO: pip is looking at multiple versions of six to determine which version is compatible with other requirements. This could take a while.
Collecting six>=1.9.0
  Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
INFO: pip is looking at multiple versions of s3transfer to determine which version is compatible with other requirements. This could take a while.
Collecting s3transfer<0.4.0,>=0.3.0
  Using cached s3transfer-0.3.6-py2.py3-none-any.whl (73 kB)
INFO: pip is looking at multiple versions of requests-toolbelt to determine which version is compatible with other requirements. This could take a while.
Collecting requests-toolbelt>=0.7.1
  Using cached requests_toolbelt-0.9.0-py2.py3-none-any.whl (54 kB)
INFO: pip is looking at multiple versions of requests-oauthlib to determine which version is compatible with other requirements. This could take a while.
Collecting requests-oauthlib>=0.6.1
  Using cached requests_oauthlib-1.2.0-py2.py3-none-any.whl (22 kB)
INFO: pip is looking at multiple versions of redis to determine which version is compatible with other requirements. This could take a while.
Collecting redis>=3.2.0
  Using cached redis-3.5.2-py2.py3-none-any.whl (71 kB)
INFO: pip is looking at multiple versions of pytz to determine which version is compatible with other requirements. This could take a while.
Collecting pytz>dev
  Using cached pytz-2020.5-py2.py3-none-any.whl (510 kB)
INFO: pip is looking at multiple versions of python3-openid to determine which version is compatible with other requirements. This could take a while.
Collecting python3-openid>=3.0.10
  Using cached python3_openid-3.1.0-py3-none-any.whl (130 kB)
INFO: pip is looking at multiple versions of pyrsistent to determine which version is compatible with other requirements. This could take a while.
Collecting pyrsistent>=0.14.0
  Using cached pyrsistent-0.17.3.tar.gz (106 kB)
INFO: pip is looking at multiple versions of pyjwt to determine which version is compatible with other requirements. This could take a while.
Collecting PyJWT>=1.4.0
  Using cached PyJWT-2.0.1-py3-none-any.whl (15 kB)
INFO: pip is looking at multiple versions of pyenchant to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of proto-plus to determine which version is compatible with other requirements. This could take a while.
Collecting proto-plus>=0.4.0
  Using cached proto_plus-1.18.1-py3-none-any.whl (42 kB)
INFO: pip is looking at multiple versions of oauthlib to determine which version is compatible with other requirements. This could take a while.
Collecting oauthlib>=1.0.3
  Using cached oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
INFO: pip is looking at multiple versions of msgpack-python to determine which version is compatible with other requirements. This could take a while.
Collecting msgpack-python!=0.5.0,!=0.5.1,!=0.5.2,!=0.5.3,!=0.5.4,!=0.5.5,<=0.5.6,>=0.4.6
  Using cached msgpack-python-0.4.8.tar.gz (113 kB)
INFO: pip is looking at multiple versions of libcst to determine which version is compatible with other requirements. This could take a while.
Collecting libcst>=0.2.5
  Using cached libcst-0.3.19-py3-none-any.whl (513 kB)
INFO: pip is looking at multiple versions of kombu to determine which version is compatible with other requirements. This could take a while.
Collecting kombu<4.7,>=4.6.10
  Using cached kombu-4.6.10-py2.py3-none-any.whl (184 kB)
INFO: pip is looking at multiple versions of jmespath to determine which version is compatible with other requirements. This could take a while.
Collecting jmespath<1.0.0,>=0.7.1
  Using cached jmespath-0.9.5-py2.py3-none-any.whl (24 kB)
INFO: pip is looking at multiple versions of isodate to determine which version is compatible with other requirements. This could take a while.
Collecting isodate>=0.5.0
  Using cached isodate-0.5.4.tar.gz (27 kB)
INFO: pip is looking at multiple versions of idna to determine which version is compatible with other requirements. This could take a while.
Collecting idna<2.8,>=2.5
  Using cached idna-2.6-py2.py3-none-any.whl (56 kB)
INFO: pip is looking at multiple versions of google-cloud-core to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-core<2.0dev,>=1.1.0
  Using cached google_cloud_core-1.7.1-py2.py3-none-any.whl (28 kB)
Collecting google-auth<2.0dev,>=1.21.1
  Using cached google_auth-1.23.0-py2.py3-none-any.whl (114 kB)
  Using cached google_auth-1.22.1-py2.py3-none-any.whl (114 kB)
  Using cached google_auth-1.22.0-py2.py3-none-any.whl (114 kB)
Collecting aiohttp<4.0.0dev,>=3.6.2
  Downloading aiohttp-3.7.4.post0-cp39-cp39-manylinux2014_x86_64.whl (1.4 MB)
     |████████████████████████████████| 1.4 MB 3.1 MB/s
Collecting google-auth<2.0dev,>=1.21.1
  Using cached google_auth-1.21.3-py2.py3-none-any.whl (93 kB)
  Using cached google_auth-1.21.2-py2.py3-none-any.whl (93 kB)
  Using cached google_auth-1.21.1-py2.py3-none-any.whl (93 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc] to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc]<2.0.0dev,>=1.22.0
  Using cached google_api_core-1.24.0-py2.py3-none-any.whl (91 kB)
  Using cached google_api_core-1.23.0-py2.py3-none-any.whl (91 kB)
  Using cached google_api_core-1.22.4-py2.py3-none-any.whl (91 kB)
  Using cached google_api_core-1.22.3-py2.py3-none-any.whl (91 kB)
  Using cached google_api_core-1.22.2-py2.py3-none-any.whl (91 kB)
  Using cached google_api_core-1.22.1-py2.py3-none-any.whl (91 kB)
Collecting google-auth<2.0dev,>=1.19.1
  Using cached google_auth-1.21.0-py2.py3-none-any.whl (92 kB)
  Using cached google_auth-1.20.1-py2.py3-none-any.whl (91 kB)
  Using cached google_auth-1.20.0-py2.py3-none-any.whl (91 kB)
  Using cached google_auth-1.19.2-py2.py3-none-any.whl (91 kB)
  Using cached google_auth-1.19.1-py2.py3-none-any.whl (91 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.22.0
  Using cached google_api_core-1.22.0-py2.py3-none-any.whl (91 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc] to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of gitdb2 to determine which version is compatible with other requirements. This could take a while.
Collecting gitdb2<3,>=2
  Using cached gitdb2-2.0.5-py2.py3-none-any.whl (62 kB)
  Using cached gitdb2-2.0.4-py2.py3-none-any.whl (62 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
INFO: pip is looking at multiple versions of future to determine which version is compatible with other requirements. This could take a while.
Collecting future>=0.18.0
  Using cached future-0.18.1.tar.gz (828 kB)
  Using cached future-0.18.0.tar.gz (830 kB)
INFO: pip is looking at multiple versions of defusedxml to determine which version is compatible with other requirements. This could take a while.
Collecting defusedxml>=0.5.0rc1
  Using cached defusedxml-0.7.0-py2.py3-none-any.whl (25 kB)
  Using cached defusedxml-0.7.0rc2-py2.py3-none-any.whl (25 kB)
INFO: pip is looking at multiple versions of cryptography to determine which version is compatible with other requirements. This could take a while.
Collecting cryptography>=1.4
  Using cached cryptography-3.4.7-cp36-abi3-manylinux2014_x86_64.whl (3.2 MB)
  Using cached cryptography-3.4.6-cp36-abi3-manylinux2014_x86_64.whl (3.2 MB)
INFO: pip is looking at multiple versions of cffi to determine which version is compatible with other requirements. This could take a while.
Collecting cffi>=1.0.0
  Using cached cffi-1.14.5-cp39-cp39-manylinux1_x86_64.whl (406 kB)
  Using cached cffi-1.14.4-cp39-cp39-manylinux1_x86_64.whl (405 kB)
INFO: pip is looking at multiple versions of certifi to determine which version is compatible with other requirements. This could take a while.
Collecting certifi>=2017.4.17
  Using cached certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
  Using cached certifi-2020.11.8-py2.py3-none-any.whl (155 kB)
INFO: pip is looking at multiple versions of cached-property to determine which version is compatible with other requirements. This could take a while.
Collecting cached-property>=1.3.0
  Using cached cached_property-1.5.1-py2.py3-none-any.whl (6.0 kB)
INFO: pip is looking at multiple versions of amqp to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of webencodings to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pyldap to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of ply to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of jdcal to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of et-xmlfile to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of celery to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of xmlsec to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of wirerope to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of urllib3 to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of ua-parser to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of sqlparse to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of six to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of s3transfer to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of requests-toolbelt to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of requests-oauthlib to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of redis to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pytz to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of python3-openid to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pyrsistent to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pyjwt to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pyenchant to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of proto-plus to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of oauthlib to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of msgpack-python to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of libcst to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of kombu to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of jmespath to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of isodate to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of idna to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of google-cloud-core to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of gitdb2 to determine which version is compatible with other requirements. This could take a while.
  Using cached cached_property-1.4.3-py2.py3-none-any.whl (10 kB)
INFO: pip is looking at multiple versions of botocore to determine which version is compatible with other requirements. This could take a while.
Collecting botocore<1.19.0,>=1.18.0
  Using cached botocore-1.18.17-py2.py3-none-any.whl (6.7 MB)
INFO: pip is looking at multiple versions of future to determine which version is compatible with other requirements. This could take a while.
  Using cached botocore-1.18.16-py2.py3-none-any.whl (6.7 MB)
INFO: pip is looking at multiple versions of billiard to determine which version is compatible with other requirements. This could take a while.
Collecting billiard<4.0,>=3.6.3.0
  Using cached billiard-3.6.3.0-py3-none-any.whl (89 kB)
INFO: pip is looking at multiple versions of defusedxml to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of attrs to determine which version is compatible with other requirements. This could take a while.
Collecting attrs>=17.4.0
  Using cached attrs-20.3.0-py2.py3-none-any.whl (49 kB)
INFO: pip is looking at multiple versions of cryptography to determine which version is compatible with other requirements. This could take a while.
  Using cached attrs-20.2.0-py2.py3-none-any.whl (48 kB)
INFO: pip is looking at multiple versions of asgiref to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of appdirs to determine which version is compatible with other requirements. This could take a while.
Collecting appdirs>=1.4.0
  Using cached appdirs-1.4.3-py2.py3-none-any.whl (12 kB)
INFO: pip is looking at multiple versions of cffi to determine which version is compatible with other requirements. This could take a while.
  Using cached appdirs-1.4.2-py2.py3-none-any.whl (12 kB)
INFO: pip is looking at multiple versions of vine to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of rjsmin to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of rcssmin to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of zeep to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of tesserocr to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of ruamel-yaml to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of python3-saml to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of psycopg2-binary to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of phply to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of mysqlclient to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of mercurial to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of iniparse to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of google-cloud-translate to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of git-review to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-auth-ldap to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of chardet to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of boto3 to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of akismet to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of aeidon to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of whoosh to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of weblate-schemas to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of weblate-language-data to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of user-agents to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of translation-finder to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of translate-toolkit to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of social-auth-core to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of social-auth-app-django to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of siphashc to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of sentry-sdk to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of requests to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of python-redis-lock to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of python-dateutil to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pyparsing to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pygobject to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pycairo to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of pillow to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of openpyxl to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of misaka to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of methodtools to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of lxml to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of jsonschema to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of jellyfish to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of html2text to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of hiredis to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of gitpython to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of filelock to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of djangorestframework to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-redis to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-filter to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-crispy-forms to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-compressor to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django-appconf to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of django to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of diff-match-patch to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of cython to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of cssselect to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of celery[redis] to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of borgbackup to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of <Python from Requires-Python> to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of bleach to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install -r req.txt (line 19), google-api-core[grpc]==1.25.0, google-api-core[grpc]==1.25.1, google-api-core[grpc]==1.26.0, google-api-core[grpc]==1.26.1, google-api-core[grpc]==1.26.2, google-api-core[grpc]==1.26.3, google-api-core[grpc]==1.27.0, google-api-core[grpc]==1.28.0, google-api-core[grpc]==1.29.0, google-api-core[grpc]==1.30.0, google-api-core[grpc]==1.31.0, google-api-core[grpc]==1.31.1, google-api-core[grpc]==1.31.2 and setuptools==36.0.1 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.31.2 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.31.1 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.31.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.30.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.29.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.28.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.27.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.26.3 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.26.2 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.26.1 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.26.0 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.25.1 depends on setuptools>=40.3.0
    The user requested setuptools==36.0.1
    The user requested setuptools==36.0.1
    jsonschema 3.0.0 depends on setuptools
    google-api-core[grpc] 1.25.0 depends on setuptools>=40.3.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies

@NightMachinery

I install any packages I might use in my small scripts, one-liners, and interactive usage into the global Python env. These amount to a lot of packages, but I do not particularly care about conflicts between them; if some package doesn't play nice with other packages' latest versions, I just move the scripts using that particular package into a virtual env. (Which had never happened until now, using the legacy resolver.)

This new resolver takes so long on this use case that I have never seen it actually complete. Here is my requirements.txt.

@pradyunsg
Member

Yea, this looks like a case where tree trimming is what we need.

@notatallshaw
Contributor

but I do not particularly care about conflicts between them; If some package doesn't play nice with other packages' latest versions,

A requirements file is supposed to specify the versions of packages which work with each other. E.g. if I use async / await code I should specify Python 3.6+; no amount of fixing the pip resolver would let that code work on Python 2.7 or Python 3.5. The same applies if I call a method that only exists in a certain version of a library.
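To make the Requires-Python point above concrete, a package can declare its supported interpreter range in its metadata, and pip will skip releases whose range excludes the running interpreter. A minimal sketch, assuming a hypothetical package name, using PEP 621 pyproject.toml metadata:

```toml
# Hypothetical package metadata, for illustration only.
# pip reads this as the release's "Requires-Python" and will not
# select this version on Python 2.7 or 3.5.
[project]
name = "example-async-lib"   # assumed name, not a real package
version = "1.0"
requires-python = ">=3.6"    # async / await syntax needs 3.6+
```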

If you don't care whether packages work or not, you can keep using pip 20.2.

This new resolver takes so long on this use case that I have never seen it actually complete. Here is my requirements.txt.

Your use case as you've specified is not supported by the new resolver: it will not install packages that declare themselves incompatible with each other.

That said, I agree the new resolver should be optimized to be faster; in fact, with my experimental version of pip I can install your requirements.txt in a few minutes without any issues: #10201 (comment)

@pfmoore
Member

pfmoore commented Aug 31, 2021

Your use case as you've specified is not supported by the new resolver

I'd actually state that more strongly. Your use case isn't supported by pip, unless you use --no-deps and take responsibility for handling all dependency management yourself. We've always considered failing to satisfy a package's dependencies to be a bug, and we make no promises that pip will work if used in an environment that has preinstalled packages that fail pip check.

"Pip takes too long to tell me I have conflicting dependencies" is a valid bug report. But "pip won't install a set of conflicting requirements" isn't.

@Suor

Suor commented Sep 3, 2021

I would say that if pip used to work with a certain requirements file, and after an upgrade it either refuses to or takes hours, then this is a showstopper bug and shouldn't be treated as business as usual. People rely on such infrastructure tools to be backwards compatible.

@notatallshaw
Contributor

I would say that if pip used to work with a certain requirements file, and after an upgrade it either refuses to or takes hours, then this is a showstopper bug and shouldn't be treated as business as usual. People rely on such infrastructure tools to be backwards compatible.

People also require such infrastructure to be correct: prior to 20.3, pip would install packages that were explicitly incompatible. If you want to be fast but wrong, you can pin to pip 20.2.

Do you have a reproducible example? I am actually trying to do something about the performance issue by experimenting with optimizations: #10201 (comment) . A lot of people complain about this issue but don't provide an example with enough information to reproduce it, so the people trying to do something about it (like myself) can't actually help you.

@NightMachinery

I think my use case would be solved just by having the ability to pin packages to the latest version: telethon=latest, as I want my global packages to just be the latest version regardless of any conflicts. I do not want pip as a version resolver; I just want it to automatically install the dependencies, kind of like how apt or brew work.

@notatallshaw
Contributor

notatallshaw commented Sep 3, 2021

I think my use case would be solved just by having the ability to pin packages to the latest version: telethon=latest, as I want my global packages to just be the latest version regardless of any conflicts. I do not want pip as a version resolver; I just want it to automatically install the dependencies, kind of like how apt or brew work.

Is that a feature that you or someone else has requested, or that someone is implementing? Or is it just a musing on a nice-to-have? Personally I'm not sold on =latest being that useful: it seems likely both to expose you to breaking APIs and to end up with ResolutionImpossible being raised. And it won't help with sub-dependencies: if you depend on A==latest and B==latest, but A depends on C, D, and E, and B depends on D, E, and F, then pip still needs to resolve any conflicts in the requirements of D and E.

FYI I think a feature with a similar effect is one I've written up here: #10417, where you set --max-backtracks 0. With this, if you set your requirements to A, B, and let's say A depends on B==v2 but the latest B is v3, I think this should still be installable, because by the time pip tries to pin B it already has the requirement from A that B must be v2. Whereas if you set your requirements to A==latest, B==latest, then you will end up with ResolutionImpossible.

But I am a while away from being able to submit a PR for this, so no explicit work is being done on it right now.

@Suor

Suor commented Sep 6, 2021

Do you have a reproducible example?

I have a requirements file that I cannot install into an existing env (too slow). It installs into a freshly created one, though.

@Suor

Suor commented Sep 6, 2021

A way to hang pip: download the requirements file, make a fresh virtualenv (I used Python 3.8), and run:

pip install pip==20.2
pip install -r requirements-broken.txt 
pip install -U pip
pip install -r requirements-broken.txt 

requirements-broken.txt

@notatallshaw
Contributor

notatallshaw commented Sep 6, 2021

A way to hang pip: download the requirements file, make a fresh virtualenv (I used Python 3.8), and run:

pip install pip==20.2
pip install -r requirements-broken.txt 
pip install -U pip
pip install -r requirements-broken.txt 

requirements-broken.txt

So the first installation of the requirements using pip 20.2 creates a broken environment, and from this point onwards I don't think this scenario is supported (that's up to the pip maintainers). Pip explicitly tells you this when you install:

ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts.

We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.

gcsfs 2021.6.1 requires fsspec==2021.06.1, but you'll have fsspec 2021.8.1 which is incompatible.
s3fs 2021.6.1 requires fsspec==2021.06.1, but you'll have fsspec 2021.8.1 which is incompatible.

At this point I don't think installing the same requirements with pip 20.3+ has any hope, because it would need to install older versions of dependencies than are already installed. I could be wrong; maybe there is some incantation of option flags that lets pip downgrade already-installed dependencies when using a requirements file?

Unfortunately, pip 20.3+ doesn't tell you why it can't install, because it gets stuck backtracking within the possible set of dependencies to see if there is some solution (and goes down the wrong path). However, when I use the version of pip I have created here, which attempts to optimize this large backtracking situation, it very quickly explains why the current environment won't work:

ERROR: Cannot install -r req.txt (line 22) and dvc[azure,gdrive,gs,s3,ssh]==2.5.4 because these package versions have conflicting dependencies.

The conflict is caused by:
    aioboto3 9.2.0 depends on aiobotocore[boto3]==1.3.3
    dvc[azure,gdrive,gs,s3,ssh] 2.5.4 depends on aiobotocore[boto3]==1.3.0; extra == "s3"

@pfmoore
Member

pfmoore commented Sep 6, 2021

So the first installation of the requirements using pip 20.2 creates a broken environment, and from this point onwards I don't think this scenario is supported (that's up to the pip maintainers).

Correct. If you have broken dependencies in your existing installation (you can check this with pip check) you need to fix those before installing anything new. We do not support installing into broken environments - it might work, or it might not, but there's no guarantees.

@Suor

Suor commented Sep 10, 2021

We do not support installing into broken environments

It would be fine if that simply didn't work, hanging up is worse.

@pfmoore
Member

pfmoore commented Sep 10, 2021

It would be fine if that simply didn't work, hanging up is worse.

You can run pip check on your environment first. That will report any issues without hanging.
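The fsspec mismatch that pip 20.2 warned about above is exactly the kind of thing `pip check` reports. As a rough sketch of what such a check does conceptually (this is not pip's implementation: the environment below is hard-coded toy data, and the specifier parser only understands `==` and `>=` rather than full PEP 440):

```python
# Toy sketch of a "pip check"-style conflict scan over an installed
# environment, modeled on the gcsfs/s3fs/fsspec conflict quoted above.

# Hypothetical installed environment: name -> (version, dependency specs)
installed = {
    "gcsfs": ("2021.6.1", ["fsspec==2021.06.1"]),
    "s3fs": ("2021.6.1", ["fsspec==2021.06.1"]),
    "fsspec": ("2021.8.1", []),
}

def version_tuple(v):
    """'2021.06.1' -> (2021, 6, 1); real pip uses full PEP 440 parsing."""
    return tuple(int(part) for part in v.split("."))

def satisfied(spec, version):
    """Check one simplified specifier like 'fsspec==2021.06.1' or 'pkg>=1.2'."""
    for op in ("==", ">="):
        if op in spec:
            _, wanted = spec.split(op)
            have, want = version_tuple(version), version_tuple(wanted)
            return have == want if op == "==" else have >= want
    return True  # bare name: any installed version is fine

def check(env):
    """Return human-readable conflicts, one per unsatisfied dependency."""
    problems = []
    for pkg, (_, deps) in env.items():
        for spec in deps:
            dep_name = spec.split("==")[0].split(">=")[0]
            dep_version = env.get(dep_name, (None, []))[0]
            if dep_version is None:
                problems.append(f"{pkg} requires {spec}, which is not installed")
            elif not satisfied(spec, dep_version):
                problems.append(f"{pkg} requires {spec}, but {dep_version} is installed")
    return problems

for problem in check(installed):
    print(problem)
```

Run against the toy environment, this reports the same two mismatches pip 20.2 warned about: gcsfs and s3fs each pin fsspec==2021.06.1 while fsspec 2021.8.1 is installed.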

@notatallshaw
Contributor

requirements.txt:

apache_beam
gcsfs
google-api-python-client
google
h5py<3.0
hdfs
hyperopt
interface
jsonpickle
keras
kfp
kfserving
kubernetes==10.0.1
matplotlib
numpy
pandas
pymysql
PyYAML
shap
tensorboard
tensorflow>=2.0.0
tensorflow-transform>=0.15.0

@njiles Apologies, this is going back a while; I've been scouring these posts for all reproducible examples of pip taking a very long time backtracking, and this is one such example.

Using the changes I propose here, it is able to backtrack much more efficiently, and after a little while it gives the following error; hope that helps:

ERROR: Cannot install -r .\requirements.txt (line 21) and h5py<3.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested h5py<3.0
    tensorflow 2.6.0 depends on h5py~=3.1.0
    The user requested h5py<3.0
    tensorflow 2.5.1 depends on h5py~=3.1.0
    The user requested h5py<3.0
    tensorflow 2.5.0 depends on h5py~=3.1.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies

fmigneault added a commit to Ouranosinc/Magpie that referenced this issue Sep 17, 2021
…t, astroid, pycodestyle never resolve in endless retry loop)
@stonebig
Contributor

stonebig commented Sep 18, 2021

great unexpected news for me:

@Keramblock

That is a good idea, but an awful solution. Downloading all packages just to resolve dependencies? What were you thinking?

@notatallshaw
Contributor

notatallshaw commented Oct 11, 2021

That is a good idea, but an awful solution. Downloading all packages just to resolve dependencies? What were you thinking?

That's not what pip is doing. Pip downloads the latest version of each requirement and, in general, only downloads an older package if there are conflicting requirements.
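To illustrate why that backtracking can get expensive, here is a toy depth-first resolver (purely illustrative; pip's actual resolution uses resolvelib and is far more involved). It mimics the setuptools==36.0.1 vs. google-api-core conflict from the report at the top of this thread: every google-api-core candidate must be tried and rejected before the search can give up, and with more packages the discarded combinations multiply:

```python
# Toy backtracking resolver, illustration only.
# Each package maps to its candidate versions, newest first.
packages = {
    # Hypothetical candidate list mirroring the 1.25.0-1.31.0 releases
    # named in the conflict report above.
    "google-api-core": [f"1.{minor}.0" for minor in range(31, 24, -1)],
    "setuptools": ["36.0.1"],  # the user pinned this one version
}

def compatible(assignment):
    # Per the conflict report, google-api-core needs setuptools>=40.3.0,
    # so the user's pin can never satisfy it.  (Plain string comparison
    # is a simplification that happens to work for these versions.)
    if "google-api-core" in assignment and "setuptools" in assignment:
        return assignment["setuptools"] >= "40.3.0"
    return True

attempts = 0  # how many candidate versions the search tries overall

def resolve(names, assignment):
    """Depth-first search over candidate versions, backtracking on conflict."""
    global attempts
    if not names:
        return assignment  # every package pinned successfully
    name, rest = names[0], names[1:]
    for version in packages[name]:
        attempts += 1
        trial = {**assignment, name: version}
        if not compatible(trial):
            continue  # reject this candidate, try the next one
        result = resolve(rest, trial)
        if result is not None:
            return result
    return None  # all candidates failed: backtrack to the caller

solution = resolve(list(packages), {})
print(solution, attempts)
```

Here the search tries all seven google-api-core candidates (plus the setuptools pin under each) before concluding there is no solution; real-world requirement sets with dozens of packages and hundreds of candidate versions blow this up combinatorially, which is what the backtracking-optimization work linked above targets.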

@notatallshaw
Contributor

I think this issue can now be closed with #10481

New reports will require new analysis and the reasoning will be different.


@pradyunsg
Member

pradyunsg commented Oct 11, 2021

Since this issue was filed, we've made significant improvements to the dependency resolution logic and to our documentation around dependency resolution, and we've improved behaviour for many of the reported instances of poor behaviour.

I'd like to thank everyone who's engaged constructively in this discussion.


If the behaviour of pip's dependency resolver is still an issue for your usecase with pip 21.3 or newer, please file a new issue for your usecase. Notably, please do NOT file a blanket issue for this problem like "pip's resolver is slow" or "pip's resolver backtracks a lot" -- such issues increase the amount of effort pip's maintainers have to put in to triage through the reports, to consolidate similar cases as well as to figure out what's actionable about each of them.

We'd appreciate bug reports containing clear information about how to reproduce the behaviour that you're seeing, and what you'd want it to do differently. I'm sure there's still a lot of ways we can improve the behaviour of pip's dependency resolver, and reports that contain enough information to reproduce the issue will help us identify and improve them.

As a reminder, currently all of pip's maintainers contribute to pip in a largely or completely volunteer capacity. Further, dependency resolution is a complicated problem, both computationally (NP-complete, in case you're algorithmically minded) and in terms of what the "right answer" is for various usecases (the strategies for "getting the right answer" are often contradictory).


I'm gonna go ahead and lock this now, since this thread has already started going off-topic.

@pypa pypa locked as resolved and limited conversation to collaborators Oct 11, 2021