
Goal: ensure lock file consistency after all installation commands #1255

Closed · 8 tasks done
ncoghlan opened this issue Dec 31, 2017 · 50 comments
Labels: Category: Future · Type: Discussion · Type: Enhancement 💡

@ncoghlan
Member

ncoghlan commented Dec 31, 2017

As noted in #1137, pipenv doesn't currently check for requirement conflicts between [packages] and [dev-packages], which can cause problems when attempting to generate a flat requirements.txt that covers both, as well as causing general weirdness when dependencies conflict.

The way pipenv install works (running the installation and then updating the lock file) means it is also relatively easy for the local environment to get out of sync with the lock file: if the installation request covers an already installed package, then it won't be updated locally, but the lock file will be updated to the latest version available from any configured package indices.

This issue doesn't cover an individual feature request. Instead, it establishes the goal of helping to ensure consistency between the lock file and the local environment by structuring installation commands to modify the lock file first, and then use the updated lock file to drive the actual installation step. (We don't have any kind of target ETA for how long this work will take - writing it down is just intended to help ensure we're all heading in the same direction, and are all comfortable with that direction)

A key aspect of this is going to be clarifying the division of responsibilities between pipenv install, pipenv uninstall, pipenv lock, and pipenv update (and potentially adding one or more new subcommands if deemed necessary).

Proposed substeps:

  • add a pipenv sync subcommand that ensures that the current environment matches the lock file (akin to pip-sync in pip-tools). (Implemented by @kennethreitz for 10.0.0, with pipenv sync ensuring versions of locked packages match the lock file, while pipenv clean removes packages that aren't in the lock file. I like that change relative to pip-sync, as it means that including an implicit sync in another command won't unexpectedly delete anything from the virtual environment)
  • Keep the current behaviour for pipenv install with no arguments, but switch to recommending the use of pipenv sync when setting up a fresh environment (that way pipenv install is used mainly for changing the installed components)
  • update pipenv lock to always ensure that [packages] and [dev-packages] are consistent with each other (resolution for Pipenv ignores conflicts between default and development dependencies #1137)
  • change pipenv update to be equivalent to pipenv lock && pipenv sync (pipenv update is instead being removed entirely - its old behaviour was actually comparable to what is now pipenv sync && pipenv clean)
  • add a new pipenv lock --keep-outdated option that still generates a fresh Pipfile.lock from Pipfile, but minimises the changes made to only those needed to satisfy any changes made to Pipfile (whether that's package additions, removals, or changes to version constraints)
  • change pipenv install <packages> to be equivalent to "add or update entries in Pipfile" followed by pipenv lock && pipenv sync (implemented in Change pipenv install to update the lock file first #1486).
  • Add a --keep-outdated option to pipenv install that it passes through to the pipenv lock operation.
  • add a new pipenv install --selective-upgrade <packages> feature that's semantically equivalent to "remove those package entries from Pipfile (if present)", run pipenv lock --keep-outdated, then run pipenv install --keep-outdated <packages> (this is the final step that delivers support for Updating only one locked dependency #966). If just a package name is given, then the existing version constraint in Pipfile is used, otherwise the given version constraint overwrites the existing one. The effect of this on the current environment should be the same as pip install --upgrade --upgrade-strategy=only-if-needed <packages> in pip 9+ (except that Pipfile.lock will also be updated).
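The resolution behaviour the last few substeps describe can be sketched in miniature. This is a toy model of my own, not pipenv's implementation: the helper names, the dict-based Pipfile/lockfile representation, and the simplified version handling (only `*`, `>=`, and `==` constraints, dotted-integer versions) are all assumptions for illustration, and transitive dependencies are omitted entirely.

```python
def _v(s):
    """Parse '1.2.3' into a comparable tuple (toy version handling)."""
    return tuple(int(p) for p in s.split("."))

def satisfies(version, constraint):
    """Check a version against a Pipfile-style constraint (subset only)."""
    if constraint == "*":
        return True
    if constraint.startswith(">="):
        return _v(version) >= _v(constraint[2:])
    if constraint.startswith("=="):
        return version == constraint[2:]
    raise ValueError("unsupported constraint: " + constraint)

def relock(pipfile, old_lock, available, upgrade=()):
    """Re-lock with minimal churn: keep the old pin unless the package
    is new, named in `upgrade`, or its constraint rejects the old pin."""
    new_lock = {}
    for name, constraint in pipfile.items():
        pinned = old_lock.get(name)
        if pinned and name not in upgrade and satisfies(pinned, constraint):
            new_lock[name] = pinned  # --keep-outdated: leave it alone
        else:  # resolve to the newest version the constraint allows
            new_lock[name] = max(
                (c for c in available[name] if satisfies(c, constraint)),
                key=_v)
    return new_lock
```

In this model, a default pipenv lock is `relock(pipfile, {}, available)` (everything upgrades), pipenv lock --keep-outdated is `relock(pipfile, old_lock, available)`, and a selective upgrade of flask is `relock(pipfile, old_lock, available, upgrade=("flask",))`.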

If anyone wants to pick up one of these substeps, make a comment below to say you're working on it (to attempt to minimise duplication of effort), then file a PR linking back to this issue. (Steps listed earlier will need to be completed before later steps are practical)

Open questions:

  • None

Resolved questions (at least for the immediate future):

  • pipenv lock will continue to default to upgrading everything by default, with pipenv lock --keep-outdated to request a minimal update that only adjusts the lock file to account for Pipfile changes (additions, removals, and changes to version constraints)
  • pipenv install --skip-lock will continue to work as it does today (even though it means the lock file and the local environment can get out of sync: use pipenv sync && pipenv clean to resync them)
  • pipenv install and pipenv install <package> will continue to imply a full pipenv lock by default, with pipenv install --keep-outdated needed to request only the minimal changes required to satisfy the installation request
  • pipenv install <package> will continue to retain the existing version constraint in Pipfile if none is given on the command line, even for the new --selective-upgrade option
  • pipenv uninstall <package> will just remove the specified package[s], and hence may leave no longer needed dependencies in the local environment. Running pipenv lock && pipenv sync && pipenv clean will clear them out.

Note: the original proposal here was just to ensure that [dev-packages] and [packages] were kept in sync when generating the lock file, and the first few posts reflect that. The current proposal instead covers the lock file driven symmetric update proposal first mentioned in #1255 (comment)

@ncoghlan
Member Author

Noting a potential implementation challenge here: I'm not sure that pip-tools currently supports constraints files, in which case we'd need to add that support first in order to gain the full benefit of this approach.

However, install-time conflict detection should be possible regardless, since pip is used to handle the actual installs.

@taion

taion commented Dec 31, 2017

You don't need to wait until install time per se. You can do this when you resolve the full transitive dependencies for the lock file.

You can mostly do this with LocalRequirementsRepository per @vphilippon's suggestion for not updating unrelated dependencies in #966 (comment). It looks sort of like https://github.com/taion/pipf/blob/1feee35a2e4480bc7e5b53bfab17587d37bdf9dd/pipf/pipfile.py#L175-L185.

See also pypa/pipfile#100. It is still a root issue that Pipfile.lock can even represent multiple incompatible sets of dependencies for the different groups.

@ncoghlan
Member Author

@taion We're not going to switch Pipfile.lock to a flat representation, since we want to support tools relocking the deployment dependencies without even looking at the development dependencies. Yes, that does mean the [dev-packages] section may drift out of date, but that's the purpose of this issue: ensuring that when the [dev-packages] section does get updated, it will be resync'ed with the deployment packages section.

If we can detect any conflicts at lock time instead of at install time, that will be excellent, though.

@taion

taion commented Dec 31, 2017

In practice from a user perspective that's just a special case of #966 and other related (closed) issues.

Like worst case you just cache the resolved transitive dependencies in the lockfile. Yarn and npm already do that anyway. It'd let you update all your prod deps without hitting PyPI for the dev deps.

Unless I'm missing something there's no active benefit of letting things drift out of sync. As someone who's spent a lot of time with these package management patterns, most of these deviations from how existing tools work just seem to cause pain.

@taion

taion commented Dec 31, 2017

I'll also add that npm does in fact support updating only production dependencies and not development ones, and it does in fact use a flat lockfile (with cached resolutions).

Interestingly enough, Yarn doesn't for its bulk "update dependencies" subcommand (though the interactive version does split prod and dev deps), and it doesn't look like anybody's complained about it, but that can probably be chalked up to the npm-alikes using caret ranges for dependencies by default, plus people generally being good enough about following SemVer to seldom cause users problems from bumping dev deps. The Python ecosystem, not so much.

@ncoghlan
Member Author

@taion An explicit pipenv lock already fully resolves both [packages] and [dev-packages], so we shouldn't need to worry about configuration drift in that case (and that's the case where I think it's most desirable to be detecting dependency conflicts).

If there's a symmetric solution that also allows both pipenv install package and pipenv install --dev package to detect (and resolve) inconsistencies, then I agree that would be a good way to go. However, I'd also be OK with an asymmetric solution, if that was easier to implement (hence the framing of this proposal).

For the symmetric approach, I think one way of framing & implementing that would be to pursue the approach that @vphilippon described in #966 (comment), but structure it as follows:

  • add a pipenv lock --keep-outdated option to request a minimal lockfile update rather than the default comprehensive one
  • change pipenv install package to be defined as "edit Pipfile -> pipenv lock --keep-outdated -> pipenv install"

(Note: for security management reasons, I really want to keep the default behaviour of pipenv lock as upgrading everything - however, I do think a feature like --keep-outdated has a place when other mechanisms are in place to ensure timely responses to reported security issues in dependencies)

@taion

taion commented Dec 31, 2017

The behavior you're describing for pipenv install is mostly what #966 gets at, but it's not exactly the same. Imagine you have X = "*". Doing pipenv install X should probably still upgrade X in the lock file, even though there's no change to Pipfile. Pipenv needs to keep track of what actually gets updated.

In other words you can't fully split out the "install" from the "lock". See e.g. https://github.com/taion/pipf/blob/1feee35a2e4480bc7e5b53bfab17587d37bdf9dd/pipf/pipfile.py#L150-L155 – when regenerating the lockfile, I explicitly pass in the packages that are getting installed to discard the old constraints for those.

The complaint anyway isn't that something like an explicit pipenv lock updates all dependencies to the latest available; it's that pipenv install Y touches the version for an unrelated X. In other words, people shouldn't really need to explicitly run pipenv lock --keep-outdated.

Yarn's handling here is a pretty good example in terms of CLI. Yarn has yarn lockfile, which generates a lockfile without changing anything, and is explicitly marked as "[not] necessary", and it has yarn upgrade which upgrades everything. If for some reason you do manually edit your dependencies, the next step is to just run yarn install, which both updates your lockfile (minimally) and installs the necessary packages.

I've never had to explicitly generate a lockfile outside of installing my updated dependencies, and I don't see why I would want to do so. It's not a typical action with a locking package manager, since almost any action you take to modify your installed dependencies will update the lockfile anyway. The set of user-facing interactions instead looks more like:

  1. Install all dependencies
  2. Add or upgrade one or more specific dependencies
  3. Upgrade all dependencies

(1) and (2) should always apply minimal changes to my lockfile (or generate it if necessary), while (3) should essentially disregard the existing lockfile. But for just generating a lockfile, (1) is sufficient (since it's going to be run from a dev environment where everything is installed anyway).

P.S. Yarn splits out yarn add (add a dependency) from yarn install (install all dependencies). It was sort of confusing and annoying at first, but the more I've used it, the more I like that distinction. Having a single install command that both adds specific dependencies and installs all the project dependencies is sort of weird, if you think about it. Those two actions aren't the same thing.

@ncoghlan
Member Author

ncoghlan commented Jan 2, 2018

The first few paragraphs in https://lwn.net/Articles/711906/ do a pretty good job of describing my mindset here: "moving target" should be the default security model for all projects, and opting out of that and into the "hardened bunker" approach should be a deliberate design decision taken only after considering the trade-offs.

Hence the choice of option name as well: --keep-outdated isn't just inspired by pip list --outdated, it's also deliberately chosen to look dubious in a command line (since keeping outdated dependencies around is inherently dangerous from a security perspective).

For the specific case of pipenv install X, it should not implicitly do an upgrade, because pip install X doesn't implicitly do upgrades.

pipenv is currently inconsistent in regards to that latter point though, since it does the following:

  • calls pip install package (no implicit upgrade)
  • calls pipenv lock (implicit upgrades)

Thus the idea of ensuring consistency by always using the lock file to drive the installation (unless --skip-lock is used), and offering pipenv lock --keep-outdated to match pip's only-if-needed upgrade strategy.

@taion

taion commented Jan 2, 2018

I think we're in agreement on the substantive points on the desirability of upgrading dependencies.

And, sure, pipenv install -U <package> to update a single package would be better to be parallel with Pip.

I am saying, though, in a normal workflow, users have no reason to run pipenv lock, with or without --keep-outdated or whatever. The sets of commands a user might run for various scenarios are:

  • Bootstrapping brand new package: pipenv init
  • Setting up new clone of project: pipenv install
  • Adding a new dependency: pipenv install <package> (and note again that Yarn calls this add, which is a good idea)
  • Re-sync installed dependencies after manually editing Pipfile: pipenv install
  • Upgrading all dependencies to latest compatible versions: pipenv update

All the above actions keep Pipfile.lock in sync.

Or, to put it another way, I've never had to run npm shrinkwrap (with npm 5), and I've never had to run yarn lockfile.

(And the reason it's not so useful to just generate a lockfile with updated dependencies is because the very next step in a dev workflow is to reinstall the updated dependencies, then run the test suite.)

@greysteil
Contributor

FWIW, I run a tool which creates dependency update PRs automatically (Dependabot), including for Pipfiles, and I would love a pipenv lock --keep-outdated option. At the moment we edit Pipfiles to lock down every dependency except the one we're updating when we create a PR.

In my experience, PRs to update a single dependency are much more likely to be merged than ones that update many at once. Generating them automatically helps ensure the moving target keeps moving.

@ncoghlan
Member Author

ncoghlan commented Jan 2, 2018

Right, and PR generation is where I think pipenv lock --keep-outdated will be useful: it assumes the "pipenv install && run the tests" step will happen remotely on the CI server, rather than on the machine where the PR was generated. Similarly, pipenv lock is for the case where you're aiming to generate a batch update PR that you're going to test in CI, without worrying overly much about applying the change locally.

@ncoghlan
Member Author

ncoghlan commented Jan 2, 2018

In that vein, if pipenv install <requirement> were to become equivalent to "update Pipfile + pipenv lock --keep-outdated + pipenv install", then it would make sense for pipenv update to become equivalent to pipenv lock && pipenv install.

While yarn makes a distinction between yarn install and yarn add, if we were to ever add such a distinction to pipenv, I'd be more inclined to borrow from the pip-tools terminology and add a pipenv sync command to say "Make the current environment match the lock file". However, doing that's out of scope for this proposal: this should focus specifically on eliminating the opportunity for discrepancies to arise between deployment dependencies and development dependencies.

@taion

taion commented Jan 2, 2018

@greysteil

The equivalent for the Greenkeeper/&c. use case with npm is just the following, right?

$ npm install --package-lock-only <package>@latest

In fact does npm5 even expose a way to just create a package-lock.json outside of npm install?

@ncoghlan

Greenkeeper-like services are very cool and worth using, but it'd be odd to consider that a core use case (and as noted above, even that case of "update a single dependency and the lockfile, but don't install things" admits a cleaner expression).

@taion

taion commented Jan 2, 2018

Oops, I hit enter too soon. Outside of Greenkeeper-style workflows, it's almost vanishingly uncommon to just bump my dependencies without syncing them locally. It only makes sense if you somehow don't have a local copy of the deps installed. And again PR-only services are an exception, but only that.

The description here is "Python Development Workflow for Humans", and as such it's worth keeping in mind what actual development workflows look like. And we have a pretty good existence proof that explicitly running the "generate lockfile" operation is almost never necessary, so it'd be worthwhile to not think too much in terms of those.


The specific issue here, BTW, is that Pipenv uses * ranges by default. If I do pipenv install flask, I get flask = "*" in my Pipfile. That means that, if I want to upgrade my installed version of Flask in a clean way, that sort of wants to be a lockfile-only change.

An upgrade operation could in principle switch a * requirement to a >= range or something in Pipfile, but that creates a weird asymmetry between "upgrade" and "initial install". Arguably the best resolution here might be to use >= ranges for initial installs anyway. At least, I'm not sure I can see any good reason for using * ranges there.

@greysteil
Contributor

@taion - yep, agreed, not expecting you guys to keep us / other dependency update tools in mind too much, although very grateful if you do! I only mentioned it because I know I would have used pipenv lock --keep-outdated manually if I'd been updating dependencies at my previous workplace (we had a one PR per dependency update policy).

For reference, npm added a package-lock-only option in the latest release (changelog).

@taion

taion commented Jan 3, 2018

@greysteil What do you do if someone has a * range in Pipfile, though? Like in my flask = "*" example above. As-is, given that this is how Pipenv adds dependencies by default, you need to know which requirement to un-pin when rebuilding the lockfile, no?

@greysteil
Contributor

greysteil commented Jan 3, 2018

I don't understand the working of Pipenv in anywhere near the detail that others on this thread do, and don't want to take it off-topic. If the below is useful, great. If it's not, sorry!

We use some nasty hax:

  • To find the version to update to, we currently just get the latest version (we should use the resolver in pip-tools, but don't yet - turns out that's not so bad, as the next step will error if it's unresolvable)
  • To produce a Pipfile.lock that only updates that (top-level) dependency, we
    1. Update the Pipfile entry for the dependency we're updating to support the latest version
    2. Parse the existing lockfile
    3. Update the Pipfile to set the requirement on every other dependency to what it resolved to in the lockfile
    4. Run pipenv lock to generate a new lockfile
    5. Update the hash in the newly generated lockfile to be what it would be if we hadn't done step 3 (a horrible hack that I'd really like to be able to remove)

End result is that the user gets a PR with their Pipfile unchanged (if the requirement was a *) but their Pipfile.lock updated to use the latest version of that dependency.
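The Pipfile-rewriting part of the workaround above can be sketched as follows. This is an illustrative reconstruction, not Dependabot's actual code: the function name and the plain-dict representation of the Pipfile and lockfile are assumptions, and only top-level requirements are rewritten (sub-dependencies stay unlocked, as described).

```python
def pin_except(pipfile, locked_versions, updating, new_requirement):
    """Rewrite the top-level Pipfile requirements so every dependency
    except the one being updated is pinned to its currently locked
    version; the updated one gets the new requirement instead."""
    rewritten = {}
    for name, constraint in pipfile.items():
        if name == updating:
            rewritten[name] = new_requirement  # e.g. ">=0.12.2"
        else:
            rewritten[name] = "==" + locked_versions[name]
    return rewritten
```

Running pipenv lock against the rewritten Pipfile and then restoring the original file (and its hash) approximates what pipenv lock --keep-outdated would do natively.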

As I understand it, pipenv lock --keep-outdated would allow me to use the following flow instead:

  1. Update the requirement on the dependency we're updating to equal the latest version
  2. Run pipenv lock --keep-outdated to generate a new lockfile
  3. Update the hash in the newly generated lockfile to be what it would be if we had only updated the Pipfile to support the latest version

So, that would be a bunch better. Ideal would be to be able to run pipenv lock --keep-outdated <some_package_name>, but beggars can't be choosers! :octocat:

@taion

taion commented Jan 3, 2018

Suppose my Pipfile has:

flask = "*"
numpy = "*"

flask transitively depends on werkzeug. Suppose there are upgrades available for flask and numpy, and that I want to upgrade my version of flask, but not upgrade my version of numpy.

If I weren't using Pipenv, I'd just run:

$ pip install -U flask
$ pip freeze > requirements.txt

If I were using pip-tools, I'd run:

$ pip-compile -P flask
$ pip-sync # Or: pip install -r requirements.txt

Well, just doing some hypothetical pipenv lock --keep-outdated wouldn't do anything. On the other hand, fully rebuilding the lockfile would upgrade NumPy as well.

But unless I'm misunderstanding, the strategy you describe above would only update the version of flask in the lockfile, but not the version of werkzeug. This might then be an invalid set of dependencies, if the new version of flask also requires a new version of werkzeug.

As such, for this case, which I think is a very common use case for humans upgrading dependencies, there should be some way to upgrade a single package and its transitive dependencies (if needed), but not anything else.

The problem here specifically is the * dependencies. The way npm handles this is to start with something like:

flask = ">=0.12.1"

And then on requesting an upgrade, bump that to:

flask = ">=0.12.2"

In which case the strategy above of running something like a pipenv lock --keep-outdated would do the right thing. But if you're using a * range, then I don't see a real way to do this without some special tooling like pipenv install -U flask, that then drops flask specifically as a suggested pin when rebuilding the lockfile.
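The npm-style floor bump described above is mechanically simple. The helper below is hypothetical (pipenv does not rewrite constraints like this); it only shows the rule being proposed: open ranges get their lower bound raised to the freshly installed version, so a later minimal re-lock keeps the upgrade.

```python
def bump_range(constraint, new_version):
    """npm-style floor bump: raise the lower bound of an open range
    ('*' or '>=...') to the version just installed. Exact pins and
    other range styles are left for the user to edit by hand."""
    if constraint == "*" or constraint.startswith(">="):
        return ">=" + new_version
    return constraint
```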

@greysteil
Contributor

Sorry, I wasn't clear enough about step (3) above. We lock all of the top-level dependencies to the version in the lockfile, but leave the sub-dependencies unlocked.

@taion

taion commented Jan 3, 2018

@greysteil I see – so this can potentially upgrade transitive dependencies of other direct dependencies?

@greysteil
Contributor

Yep - theory is that if your top-level dependencies are specifying their sub-dependencies properly (i.e., pessimistically) then updating your sub-dependencies shouldn’t be dangerous.

(I’d rather the behaviour was closer to Ruby’s Bundler, where only sub-dependencies of the dependency being updated can also be updated, but strong-arming pipenv to do that didn’t seem worth the hackery.)
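The Bundler-style behaviour mentioned here amounts to computing the transitive closure of the dependency being updated and allowing only that set of packages to move. A sketch, assuming the dependency graph is available as a plain dict of package name to direct dependencies (the function name and representation are mine, not pipenv's or Bundler's):

```python
def allowed_to_move(updating, dep_graph):
    """Conservative update set: only the named package and its
    transitive dependencies may change version; every other package
    keeps its existing lock entry."""
    seen, stack = set(), [updating]
    while stack:
        pkg = stack.pop()
        if pkg not in seen:
            seen.add(pkg)
            stack.extend(dep_graph.get(pkg, ()))
    return seen
```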

@ncoghlan
Member Author

ncoghlan commented Jan 5, 2018

@taion By design, pipenv focuses almost entirely on the "moving target" security model I described in my LCA talk. Thus, the two intended usage models are as follows:

  • Interactive batch updates:

    • A human periodically performs some action (explicit batch update, adding a new dependency, upgrading an explicit dependency)
    • As part of these operations, the lock file and current environment are updated to the latest version of everything that satisfies the expressed constraints in Pipfile
    • Over time, either the expressed constraints are updated (and chosen dependencies are switched out) until this way of working becomes painless, or else the developers switch to the second option below
  • Automated selective updates:

    • An automated process keeps watch for new versions of dependencies and submits an issue+PR whenever they become available
    • Humans mainly set up new environments, explicitly add dependencies, and adjust the expressed constraints on existing dependencies

Now, there are currently CLI design & implementation problems affecting both of those usage models - for interactive batch updates, there are ways for the lock file and the local environment to get out of sync, and for the automated selective updates, there are challenges in implementing the tools that do the selective updates, as well as in enabling the related interactive selective operations to add new dependencies and modify the constraints on existing ones.

But that core design philosophy of "Default to running the latest version of everything unless explicitly told otherwise in Pipfile" isn't going to change - we just want to add enough selective update support to better handle the case where most lock file updates are submitted by an update management service, rather than being submitted by humans.

@greysteil
Contributor

Awesome to read that automated tools are a use case you're thinking about @ncoghlan. 🎉

If you ever want feedback on what Dependabot / others would like/need then let me know. Python's not my home language, so I've not been able to contribute to pip/pipenv yet, but I'd love to help in any way I can.

@techalchemy
Member

@ncoghlan I am totally in agreement with you here, and I think the approach you described makes sense. Using a LocalRequirementsRepository isn't very hard to implement from a code perspective, it's in line with my thinking about the project as well, and this discussion has been highly productive.

@ncoghlan
Member Author

ncoghlan commented Jan 6, 2018

@techalchemy OK, I'll revise the initial post to cover the symmetric proposal (i.e. handling installation of new packages via pipenv lock --keep-outdated + pipenv install --ignore-pipfile, rather than running the installation first the way we do now).

That will still leave us with several open questions (like how --skip-lock should work in that model), but it should be enough to start breaking out individual feature requests (like adding --keep-outdated to pipenv lock for requesting the only-if-needed upgrade strategy)

@ncoghlan ncoghlan changed the title Proposal: treat locked [packages] versions as constraints for [dev-packages] installations Goal: by default, ensure lock file consistency after all installation commands Jan 6, 2018
@ncoghlan ncoghlan changed the title Goal: by default, ensure lock file consistency after all installation commands Goal: ensure lock file consistency after all installation commands Jan 6, 2018
@ncoghlan
Member Author

ncoghlan commented Jan 6, 2018

OK, I've updated the initial post with some proposed substeps that will allow us to reach a state where it's genuinely difficult to get the lock file and the local environment out of sync.

Of the proposed substeps, I think the pipenv sync subcommand and the pipenv lock --keep-outdated option are already clear enough for interested folks to look at implementing them, but the idea of redefining the semantics of all of the other operations that may install or update packages in terms of Pipfile edits and the behaviour of pipenv lock and pipenv sync likely requires further discussion.

@matthijskooijman

Are the proposed changes still expected to fix the discrepancies between [packages] and [dev-packages]? In particular, I was running into #1220, which refers to this issue, but I can't see how the proposed changes are going to fix that inconsistency. Or is the dev/non-dev inconsistency different from the one described in #1220 and should that issue be separately investigated?

@uranusjr
Member

uranusjr commented Feb 21, 2018

@matthijskooijman It is not explicitly stated in the top post, but #1255 (comment) (which is mentioned in the post as current) does state that there needs to either be a way to “detect (and resolve) inconsistencies” in packages and dev-packages, or an alternative is needed. This to me implies that the inconsistency will be resolved.

@ncoghlan
Member Author

ncoghlan commented Feb 22, 2018

The intent is for that inconsistency to get resolved at the pipenv lock step (pipenv sync will then inherit the self-consistent lock file)

(Note: this wasn't clear previously, so I've added a separate bullet point calling that step out)

@joshfriend
Contributor

I just noticed that pipenv does not regenerate the lockfile when you change the underlying python implementation between CPython and PyPy.

  1. Create and install venv with pipenv install --python pypy
  2. delete the venv
  3. Create and install venv again with pipenv install
  4. The lockfile is not regenerated and the host-environment-markers section will still contain "platform_python_implementation": "PyPy"
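The staleness in step 4 could in principle be detected by comparing the recorded markers against the running interpreter. A sketch, with the caveat that this is a hypothetical helper, not a pipenv API, and the `_meta` / host-environment-markers key layout is assumed from older pipenv lockfile formats:

```python
import json
import platform

def stale_markers(lockfile_path):
    """Report host-environment markers recorded in Pipfile.lock that
    no longer match the running interpreter, as {key: (recorded,
    current)} pairs. Only a couple of markers are checked here."""
    with open(lockfile_path) as f:
        lock = json.load(f)
    recorded = lock.get("_meta", {}).get("host-environment-markers", {})
    current = {
        "platform_python_implementation": platform.python_implementation(),
        "platform_system": platform.system(),
    }
    return {key: (value, current[key])
            for key, value in recorded.items()
            if key in current and value != current[key]}
```

A non-empty result would signal that the lock file was generated under a different interpreter (e.g. PyPy) and should be regenerated.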

@ncoghlan
Member Author

@joshfriend That's actually intentional, but see #857 (comment) for some comments on why it's currently problematic.

@kennethreitz
Contributor

kennethreitz commented Feb 24, 2018

Keep the current behaviour for pipenv install with no arguments, but switch to recommending the use of pipenv sync when setting up a fresh environment (that way pipenv install is used mainly for changing the installed components)

I think now that install calls sync, we can still allow users to use install to bootstrap new projects.

@kennethreitz
Contributor

update pipenv lock to always ensure that [packages] and [dev-packages] are consistent with each other (resolution for #1137)

i just took a stab at this fdebdc3

it has implementation details for sure (sub deps that exist for a develop package but not a default package won't get removed, for example), but it's a definite improvement.

@kennethreitz kennethreitz added Type: Enhancement 💡 This is a feature or enhancement request. Category: Future Issue is planned for the future. Type: Discussion This issue is open for discussion. labels Feb 24, 2018
@kennethreitz
Contributor

fixed the implementation details. daa56e1

@ncoghlan
Member Author

@kennethreitz Good point regarding pipenv install reading fine for the fresh install case: it only reads strangely in the resyncing case, where it may end up doing upgrades and downgrades to make versions match. I've marked the bullet point about recommending pipenv sync as complete, and struck through the bits we decided not to change.

With your implementation of the #1137 changes, that just leaves the --keep-outdated and --selective-upgrade enhancements that together implement #966.

Given the chosen implementation approach for #1486, the --selective-upgrade feature is going to need to pass --upgrade-strategy=only-if-needed to the underlying pip install call in addition to setting keep_outdated when updating an out of date lock file.

@kennethreitz
Contributor

Working on --keep-outdated now :)

kennethreitz added a commit that referenced this issue Feb 24, 2018
@kennethreitz
Contributor

kennethreitz commented Feb 24, 2018

$ pipenv lock --keep-outdated now preserves all version numbers from the previous lockfile, unless the version numbers are pinned.

@kennethreitz
Contributor

Also added an additional Pipfile configuration option (currently we only have allow_prereleases): keep_outdated:

[pipenv]

keep_outdated = true

@kennethreitz
Contributor

--selective-upgrade is done.

@ncoghlan
Member Author

Huzzah! Closing this, as any further limitations that aren't already covered by #857 can be reported as new issues after 10.1 is released :)

@kennethreitz
Contributor

working on a bug, but will be resolved shortly

@kennethreitz
Contributor

fixed


8 participants