
How to keep setup.py install_requires and Pipfile in sync #1263

Closed
Korijn opened this issue Jan 3, 2018 · 49 comments
@Korijn

Korijn commented Jan 3, 2018

I am working on a Python package with pipenv and am faced with the challenge of keeping setup(install_requires=...) in sync with my Pipfile's runtime dependencies. Is there a recommended approach?

[Answer 2019-08-23] Best practice, as also discussed below:

For applications that are deployed or distributed in installers, I just use Pipfile.

For applications that are distributed as packages with setup.py, I put all my dependencies in install_requires.

Then I make my Pipfile depend on setup.py by running pipenv install '-e .'.

@Place1

Place1 commented Jan 7, 2018

Does pipenv have a Python API that could be used? I manually update the list as I work on a project, but something like the following could be nice:

from setuptools import setup
from pipenv import find_install_requires

setup(
    # ...
    install_requires=find_install_requires()
    # ...
)

The function just needs to return a list of the keys in the Pipfile's [packages] section. I imagine you could already achieve this with a helper function, but it'd be nice if it were part of pipenv so we don't all have to implement it.

@uranusjr
Member

uranusjr commented Jan 8, 2018

Pipfile, the implementation backing Pipenv’s Pipfile parsing, can help with this:

import pipfile
pf = pipfile.load('LOCATION_OF_PIPFILE')
print(pf.data['default'])

But I wouldn’t recommend this, or depending on Pipenv in setup.py at all. Importing pipenv (or pipfile) means the user needs to actually install it before trying to install your package, and tools like Pipenv that try to peek into the package without installing it (setup.py egg_info) won’t work. The setup.py should depend only on Setuptools.

A middle ground solution would be to write a tool similar to bumpversion that automatically syncs a text file based on Pipfile. Distribute this file with your package, and read it in setup.py. Then use CI or a commit hook to make sure the files are always in sync.
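The setup.py side of that middle ground could be as small as the sketch below; the file name runtime-requirements.txt is an assumption, it would be whatever the sync tool writes out:

```python
from pathlib import Path


def read_requirements(path="runtime-requirements.txt"):
    """Parse a plain requirements file generated from the Pipfile."""
    lines = Path(path).read_text().splitlines()
    # Ignore blank lines and comments so the generated file can be annotated.
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")]


# Then, in setup.py:
# setup(..., install_requires=read_requirements())
```

The CI job or commit hook then just regenerates the file and fails the build if it differs from the committed copy.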

@Place1

Place1 commented Jan 8, 2018 via email

@Korijn
Author

Korijn commented Jan 8, 2018

@uranusjr just testing my assumptions here, but wouldn't it be possible to add pipenv to setup.py's setup_requires, and delay the pipenv import to a setuptools Command? Or is that considered bad practice?

@uranusjr
Member

uranusjr commented Jan 8, 2018

@Korijn It might not be bad practice per se, but given that the current best practice is to use a separate virtualenv for each Python project, this would require the user to install a copy of Pipenv for each project, which is not very intuitive. Pipenv should only be installed once (usually globally), and is used outside the project’s virtualenv to manage it, not inside it.

@Californian

So what's the resolution to this that led to the issue's closure? Is there no means of keeping track of both the dependencies in the Pipfile and in setup.py? Is there a best practice that circumvents the issue?

@Korijn
Author

Korijn commented Feb 2, 2018

For applications that are deployed or distributed in installers, I just use Pipfile.

For applications that are distributed as packages with setup.py, I put all my dependencies in install_requires.

Then I make my Pipfile depend on setup.py by running pipenv install '-e .'.

[Update 2019-08-23] I keep the dev packages in Pipfile nowadays, only the runtime dependencies get to live in setup.py.

@uranusjr
Member

uranusjr commented Feb 2, 2018

I think @Korijn’s approach is best practice here. Pipfile (and requirements.txt) is for applications; setup.py is for packages. They serve different purposes. If you need to sync them, you’re doing it wrong (IMO).

@vascowhite

vascowhite commented Feb 3, 2018

@uranusjr Not according to the documentation.

Pipenv is a tool that aims to bring the best of all packaging worlds (bundler, composer, npm, cargo, yarn, etc.) to the Python world. Windows is a first–class citizen, in our world.

It automatically creates and manages a virtualenv for your projects, as well as adds/removes packages from your Pipfile as you install/uninstall packages. It also generates the ever–important Pipfile.lock, which is used to produce deterministic builds.

Maybe I'm just not getting it. Could you elaborate on your statement please?

The way I understood it is that pipenv is a complete dependency management system similar to Composer for PHP, but I'm beginning to realise that it isn't. Especially as pipenv won't install the dependencies of a dependency that has a Pipfile and Pipfile.lock, but no install_requires in setup.py.

@techalchemy
Member

@vascowhite the question you’re asking isn’t about pipenv, but about a fundamental separation between Python packaging tools. In the Python workflow, setup.py files are used to declare the installation dependencies of a distributable package. So if I have a package like requests, and it depends on people having cffi installed, I would declare that in my setup.py so that when people run pip install requests it will perform that install as well if necessary.

Pipfiles, like requirements files, are not meant to be traversed recursively. Instead, a single Pipfile rules over all of the dependencies of the project you are developing. The point is that the old workflow generated a flattened list of pinned requirements, while Pipfiles contain top-level requirements and prefer unpinned versions where possible. When you install a package, the requirements from its setup.py are recursively resolved to the best match that also fits your other requirements.

So if you want to know why Pipfiles aren’t recursively resolved, it’s because that’s just not how they are used in python. Running pipenv install ultimately requires a target that is installable by setuptools, which means it will have its install requirements defined in its setup file.

@uranusjr
Member

uranusjr commented Feb 3, 2018

@techalchemy I was half-way through a similar response before yours popped up 😂 (delete everything)

I would also like to note, @vascowhite, that what you’re asking is not in fact outlandish. With the Pipfile and the lock file both available, it is possible to reconcile the two distinct workflows. In an ideal world, Pipfile replaces setup.py’s install_requires and is used to specify virtual dependencies, while the lock file is used to produce a concrete dependency set from it, replacing requirements.txt.

Python’s packaging system, however, is far from ideal at present, and it would require a lot of cleanup before this can ever happen. Heck, Pipenv is already having difficulties handling dependencies right now (p.s. not anyone’s fault); used like that, it would probably barely work except for the simplest of projects.

Hope is not lost, though (at least not mine). There have been a lot of PEPs proposed and implemented around this issue, and I feel things are on the right track, with setup.py and requirements.txt both moving toward a rigid, declarative format. With an ecosystem this large, things need to move slowly (or see Python 3.0), but they are indeed moving.

@vascowhite

vascowhite commented Feb 3, 2018

@techalchemy @uranusjr
Thank you both for your clear answers, they do straighten a few things out in my mind. It does seem to me that the documentation is overstating what Pipenv is able to do, and that is partly the cause of my confusion. The majority of my confusion is down to me though :)

Having come from PHP I have been confused by packaging in python, Composer is a breeze in comparison. I do find python much easier to develop in and love using it. Let's hope things improve, I'm sure they will given the efforts of people like yourselves and Kenneth Reitz.

@Korijn
Author

Korijn commented Feb 3, 2018

If you stick to my advice mentioned above, you can perfectly harmonize both setup.py and pipenv. No need to get all fussy. :)

@vascowhite

looks like I'm not the only one that's confused #1398

Put much better than I could though :)

@apiraino

apiraino commented Feb 8, 2018

Came here for info on using pipenv with a setup.py; adding my 2 cents to the discussion.

I have a Python package whose setup.py looks like:

setup(
    name='my-pkg-name',
    packages=find_packages(),
    install_requires=[...],
    extras_require={
        'develop': ['click']
    },
    entry_points={
        'console_scripts': [
            'my-pkg-name-cmdline = my-pkg-name.cli:tool'
        ]
    }
)

As you can see, I use click in the script entry point for tasks such as building and deployment.

When I run $ my-pkg-name-cmdline build, click is not installed, because pipenv install --dev installs packages into the pipenv virtualenv. I need to fiddle with pipenv shell/exit to make it work. Looks like there are still some rough edges here.

Therefore +1 for not using pipenv for packages.

@Korijn
Author

Korijn commented Feb 8, 2018

I think you are expected to call pipenv run my-pkg-name-cmdline build in that scenario.

@apiraino

apiraino commented Feb 8, 2018

@Korijn I'm still not sure about the correct workflow (still experimenting a bit with pipenv).

As of yet, the workflow that seems to be working for me is:

(starting from scratch)
1- pipenv --three
2- pipenv install [--dev]
3- pipenv install -e . (install application locally)
4- pipenv shell (to enter the virtualenv)

Now I can run my package build click script from the command line.

If I enter the virtualenv (step 4) before installing the application locally (step 3), it does not work.

Perhaps I just have to rewire my brain into remembering that packages should be installed before pipenv shell (while using virtualenv requires you to install packages with the virtualenv activated).

@uranusjr
Member

uranusjr commented Feb 8, 2018

@apiraino I think you’re not getting things right here. If you want to use (import) click in your package, you should put it in install_requires instead, so that people (including yourself) installing your package get click installed as well. Putting it in extras_require['develop'] means it’s an optional dependency: your package can work without it, but installing that extra provides certain extra features.

This discussion really does not have anything to do with Pipenv anymore. I suggest you bring this problem to a more suitable forum, such as StackOverflow or Python-related mailing lists.

@benjaminweb

benjaminweb commented Feb 12, 2018

@Korijn pipenv install '-e .' yields a Pipfile not reflecting the modules listed under install_requires in setup.py.

This is still the case for pipenv 9.0.3.

How can I generate Pipfile from my setup.py's install_requires?

@techalchemy
Member

Don't use quotation marks.

@benjaminweb

I stopped using quotation marks. However, the Pipfile that gets created still does not include the deps from the install_requires section of setup.py.

@cdwilson

@benjaminweb I was confused by the same thing today. However, I'm starting to think that the current behavior may be correct.

@techalchemy mentioned above that

Pipfiles contain top-level requirements and prefer unpinned where possible. When you install a package, the requirements from its setup.py are recursively resolved to the best match that also fits your other requirements.

If you use the workflow mentioned in #1263 (comment), when you run pipenv install '-e .' on a project without an existing Pipfile, pipenv generates a new Pipfile with the following:

[packages]

"e1839a8" = {path = ".", editable = true}

In this case, the only package you explicitly requested to be installed into the virtualenv is the package itself (i.e. "."), so it makes sense that only "." is added to [packages] in the Pipfile. Similarly, if you pipenv install requests, none of the install_requires dependencies from requests's setup.py are added to your project's Pipfile either.

However, when the package installation step happens next, the install_requires dependencies will be installed as part of the dependency resolution for the package.

Note that unlike the Pipfile, the Pipfile.lock records all the exact dependencies for the entire virtualenv, which has to include the install_requires dependencies locked to specific versions. If you look at the Pipfile.lock that's generated, you'll see the install_requires dependencies listed.

It's possible I'm totally misunderstanding how this is expected to work. Maybe @techalchemy or @uranusjr can confirm if this is the correct way of thinking about this?

@uranusjr
Member

Your line of thinking matches mine. I’ll also mention that with recent Setuptools advancements and tools such as Flit, you can still specify your package’s dependencies in nice TOML form (instead of requirement strings in setup.py, which is admittedly not very pretty). You just specify them in pyproject.toml instead of Pipfile.

@cdwilson

cdwilson commented Feb 24, 2018

@uranusjr it sounds like what you're saying is that the Pipfile only needs to explicitly list project dependencies if they are not already captured by a packaging tool like Setuptools or Flit (via setup.py or pyproject.toml).

For example, if setup.py looks like:

install_requires=['A'],
extras_require={
    'dev': ['B'],
},

Then the Pipfile only needs the following:

[[source]]

url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"


[packages]

"e1839a8" = {path = ".", editable = true}


[dev-packages]

"e1839a8" = {path = ".", extras = ["dev"], editable = true}

Running pipenv install would install dependency A for production, and pipenv install --dev would install dependencies A & B for development.

If someone is already using Setuptools or Flit, is there ever any reason why dependencies should be added into the Pipfile under [packages] or [dev-packages]? Looking at Requests as an example, it's not obvious to me why the development dependencies are listed explicitly in Pipfile under [dev-packages], but the install_requires and test_requirements dependencies are all captured in setup.py.

It seems like the only reason why you would need to add dependencies explicitly to Pipfile is if you're not using Setuptools or Flit. Is this correct? Are there reasons why this is not true?

@uranusjr
Member

I think it's just personal preference. Listing dev dependencies in extras_require['dev'] is merely a convention; dev-packages, OTOH, is a well-defined key. extras_require['dev'] would also let any user pip install package[dev], which maintainers may not like. I can understand people preferring one or the other.

As for packages, no, there really isn't a scenario where it makes more sense than install_requires IMO. I'm sure people will come up with creative usages though.

keitheis added a commit to iCHEF/queryfilter that referenced this issue Feb 26, 2018
@JulienPalard

Why is this issue closed?

@techalchemy
Member

This is not a bug; you cannot use the same base entry multiple times in a Pipfile. If you specify a dependency in the dev section and also in the default section, the default section takes precedence no matter what.

I would walk through my normal thought experiment but I don’t have time just now so just take my word for it that it could cause dependency conflicts and surprises when you deploy something and find out your dev dependency was hiding a conflict.

@rubenbonilla

@techalchemy so how can I manage my dev dependencies in this case? I only want to know how to use pipenv in a good way

@uranusjr
Member

I’ve been thinking about this for my own project, and have kind of come to realise I don’t really need the packages/dev-packages distinction. How about listing {editable = true, extras = ["dev"], path = "."} in packages?
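A Pipfile sketching that idea; the key name my-package is illustrative (pipenv may generate a hashed name like e1839a8 for local paths), and it assumes the package declares a dev extra in its own setup.py:

```toml
[packages]
# The project itself, editable, with its dev extra pulled in; runtime deps
# then come from install_requires, dev deps from extras_require["dev"].
my-package = {editable = true, extras = ["dev"], path = "."}
```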

abravalheri added a commit to abravalheri/dummy-pipenv-example-for-pyscaffold that referenced this issue Jun 18, 2018
Use the `pipenv install -e .` strategy to include setuptools
`install_requires` dependencies in Pipenv.

This way abstracted requirements are expressed via `setup.cfg` while
concrete dependencies are expressed via `Pipfile.lock`.

Since extra requirements are not installed for editable dependencies
(until this moment), `testing` dependencies are handled exclusively
inside tox/pytest-runner venvs, and `dev` dependencies should be
specified directly in the Pipenv file (not included in `setup.cfg`).

ref: pypa/pipenv#1094 (comment)
     pypa/pipenv#1263 (comment)

Basic workflow:
- Add abstract dependencies to `setup.cfg`
- Proxy `setup.cfg` by doing `pipenv install -e .`
- Add dev dependencies by doing `pipenv install -d XXXX`
- Use `pipenv update -d` to compile concrete dependencies (and install
  them in a virtualenv)
- Add `Pipfile.lock` to source control for repeatable installations:
  https://caremad.io/posts/2013/07/setup-vs-requirement/
- Use `pipenv run` to run commands inside the venv (e.g. `pipenv run
  tox`)
- Don't expose test requirements directly to pip-tools. Instead, just
  rely on tox/pytest-runner to install them inside the test venv.
woky pushed a commit to rchain/pyrchain that referenced this issue Jul 9, 2019
@Madoshakalaka

Madoshakalaka commented Aug 23, 2019

Check out the pipenv-setup package.

It syncs the Pipfile/lock file to setup.py:

$ pipenv-setup sync

package e1839a8 is local, omitted in setup.py
setup.py successfully updated
23 packages from Pipfile.lock synced to setup.py

You can run $ pipenv-setup sync --dev to sync development dependencies to extras_require, or $ pipenv-setup sync --pipfile to sync from the Pipfile instead,

and $ pipenv-setup check to run checks only.

one command to solve them all 💯

@peterdeme

Is there any plan to merge pipenv-setup package to pipenv?

@Madoshakalaka

@peterdeme

Is there any plan to merge pipenv-setup package to pipenv?

@uranusjr @techalchemy based on the discussion above, I think pipenv might have a somewhat different philosophy. But if the maintainers agree, I'd very much like to submit a pull request and try to integrate pipenv-setup.

@Kilo59

Kilo59 commented Nov 19, 2019

You can always parse the Pipfile.lock with the built-in json module and extract the non-dev dependencies for your setup.py install_requires.
The "default" key contains nested "dicts" mapping each package name to its version number and markers.
You don't need to rely on any external imports.
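A sketch of that approach; lock_install_requires is a made-up helper name, and it assumes Pipfile.lock is shipped alongside setup.py:

```python
import json


def lock_install_requires(lock_path="Pipfile.lock"):
    """Build requirement strings from the lock file's non-dev "default" section."""
    with open(lock_path) as f:
        lock = json.load(f)
    requires = []
    for name, meta in lock["default"].items():
        req = name + meta.get("version", "")  # e.g. "requests==2.22.0"
        if meta.get("markers"):
            req += "; " + meta["markers"]     # keep environment markers
        requires.append(req)
    return requires
```

As noted below, the lock file then has to be distributed with the sdist (data_files or MANIFEST.in) for this to work at install time.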

@Madoshakalaka

Madoshakalaka commented Nov 19, 2019

@Kilo59 I've seen people do this. One tip: don't forget to include Pipfile.lock as a data_files entry in setup.py (or include it in MANIFEST.in). And that works for the lock file, which has pinned dependencies; the Pipfile, on the other hand, is non-trivial to parse if you want version ranges, since the same dependency requirement can appear in multiple forms.

@flipbit03

Thank you @Madoshakalaka, your tool works nicely!

I agree with other peers that setup.py's dependencies are different from the Pipfile's project dependencies. But still, having a programmatic way to sync them without manual labor is a great time-saving feature. It also avoids typos and common errors.

The blackened setup.py was a nice touch too 👍
