
Explicitly tell install_requires and avoid use pip internals #56

Open
michelts opened this issue Mar 3, 2020 · 0 comments · May be fixed by #57

Comments


michelts commented Mar 3, 2020

Hi @clemfromspace

I'm trying to install scrapy-selenium from its setup.py file (so I can use a remote Selenium instance), but the install fails with any pip version newer than 19.x.

Traceback (most recent call last):
  File "setup.py", line 5, in <module>
    from pip.download import PipSession
ModuleNotFoundError: No module named 'pip.download'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "setup.py", line 10, in <module>
    from pip._internal.download import PipSession
ModuleNotFoundError: No module named 'pip._internal.download'

This is because you are relying on pip internals to specify the requirements for the project.

I'm not an experienced packager :), but I found an article that recommends declaring install_requires and tests_require directly as lists rather than parsing requirements files.

If we use plain lists in the setup.py file, we won't need pip internals at all, and we won't need to update setup.py whenever pip changes.

Do you agree with hardcoding the versions in the setup.py file?

Thanks in advance.
