
Add an easy way to run pytest -Werror::DeprecationWarning -Werror::FutureWarning #13396

Open
amueller opened this issue Mar 5, 2019 · 11 comments
Labels: Needs Decision - Close (Requires decision for closing)

Comments

@amueller
Member

amueller commented Mar 5, 2019

CI is erroring on uncaught deprecation and future warnings, but that's hard to reproduce locally unless you really know what you're doing.

I feel like we should make this easier, or maybe even the default when running tests locally?

One way would be to do this in the makefile; that would only help people using make, but even that would be an improvement, I think.

We could also add it to the configuration, though I'm not sure if that's overkill?

It's a bit annoying to have tests pass locally but then fail on CI because it uses different flags [yes yes, I added these flags, I know].
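
A hypothetical sketch of the makefile idea above (the test-strict target name is made up for illustration, not an existing target):

```make
# Hypothetical sketch, not an existing target: run the test suite with
# deprecation and future warnings escalated to errors, mirroring CI.
test-strict:
	pytest -Werror::DeprecationWarning -Werror::FutureWarning sklearn
```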

@jnothman
Member

jnothman commented Mar 7, 2019 via email

@amueller
Member Author

Maybe I just wanted to be conservative? Actually I agree with you, we should always run it.

@rth
Member

rth commented Mar 18, 2019

> If our CI runs with warnings as errors, why is it overkill to do this in setup.cfg?

Because our CI runs with either pinned versions or the latest versions of dependencies. Developers, and people packaging scikit-learn for Linux distributions, can run it with arbitrary versions of dependencies, where we cannot guarantee that no warning will occur (particularly with not-yet-released versions of numpy/scipy). In those conditions, this option will lead to unwanted test failures.

See for instance #12043

So I don't think having it as the default is a good idea, but +1 to adding a new (non-default) entry to the makefile if you find it useful.
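
For reference, the configuration change under debate would look roughly like this (a sketch of the idea, not what the project actually ships):

```ini
# Sketch of the setup.cfg change under discussion: every
# DeprecationWarning or FutureWarning raised during the test run becomes
# an error, regardless of which package emitted it -- which is exactly
# why arbitrary dependency versions can break it, as argued above.
[tool:pytest]
filterwarnings =
    error::DeprecationWarning
    error::FutureWarning
```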

@NicolasHug
Member

A makefile entry would make passing pytest parameters like -k or -s a bit cumbersome.

Is there a way to easily bypass the flags in setup.cfg that would make the lives of those who build sklearn against arbitrary versions easier?

@amueller
Member Author

@rth lol ok yes, I was opposed to this change ;) but I clearly didn't see the ping, so that's on me. I agree your reasoning is sound, but the change makes running the tests locally too hard.

Could we by default raise on deprecation warnings only from within sklearn?

I think it's weird to have something pass locally and then fail on the CI with the same versions.

@rth
Member

rth commented Mar 18, 2019

> A makefile entry would make passing pytest parameters like -k or -s a bit cumbersome.

Yeah, that's the reason I'm not using the makefile for testing :)

> Is there a way to easily bypass the flags in setup.cfg that would make the lives of those who build sklearn against arbitrary versions easier?

Last time I checked it was non-trivial, at least much harder than providing a CLI parameter to pytest.

My point is that one shouldn't have to bypass the flags in setup.cfg. pytest sklearn should run the test suite (even without reading the docs) and fail only if something is broken in scikit-learn. When we treat warnings as errors, we don't test scikit-learn alone but also that none of the dependencies raise any warnings (including their not-yet-released versions), which is outside the scope of unit tests IMO.

For instance, I would consider it a bug if tests pass for some scikit-learn release and then start failing a few months later because of a new scipy release that's perfectly backward compatible but adds a new, harmless deprecation warning.

An option could be to suggest setting the PYTEST_ADDOPTS environment variable for developers preferring this default.
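
Concretely, that suggestion amounts to something like this in a developer's shell (a sketch; pytest reads PYTEST_ADDOPTS and adds its contents to the command line of every run):

```sh
# Opt in to warnings-as-errors locally without touching setup.cfg.
export PYTEST_ADDOPTS="-Werror::DeprecationWarning -Werror::FutureWarning"
pytest sklearn
```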

@amueller
Member Author

@rth my concern is more about deprecation warnings from within sklearn, which couldn't produce these spurious failures. If I deprecate something in sklearn, I should catch or change all occurrences. Right now the tests don't force me to do that locally, but they do on CI.

I think warnings caused by deprecations within sklearn are much more common than those from scipy or numpy. If we can't filter on origin, then I guess another option would be to not use DeprecationWarning directly but only error on our own warning class that inherits from DeprecationWarning?
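
A sketch of that idea; the class name, the warn call, and the filter entry below are all hypothetical, not existing scikit-learn API:

```python
import warnings

# Hypothetical: a project-specific deprecation warning class. Nothing
# below exists in scikit-learn; it only illustrates the idea above.
class SklearnDeprecationWarning(DeprecationWarning):
    """Deprecation warning raised by scikit-learn itself."""

def deprecated_feature():
    # sklearn's own deprecations would use the subclass...
    warnings.warn("deprecated_feature is deprecated",
                  SklearnDeprecationWarning, stacklevel=2)

# ...and a filterwarnings entry such as
#     error::sklearn.exceptions.SklearnDeprecationWarning
# would escalate only these, while plain DeprecationWarnings from
# numpy/scipy would remain ordinary warnings.
```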

@rth
Member

rth commented Mar 26, 2019

> I think warnings caused by deprecations within sklearn are much more common than those from scipy or numpy.

Yeah, but even a single deprecation warning from numpy / scipy / pandas breaking tests post-release (for the sake of development convenience) is not OK IMO. The latest was the deprecation of scipy.sparse matrices, not that long ago.

> If we can't filter on origin, then I guess another option would be to not use DeprecationWarning directly but only error on our own warning class that inherits from DeprecationWarning?

We could if you think it's worth the effort.

> If I deprecate something in sklearn, I should catch or change all occurrences. Right now the tests don't force me to do that locally, but they do on CI.

Wouldn't setting the PYTEST_ADDOPTS variable locally solve that?

To summarize, I still have concerns about changing setup.cfg to error on warnings for all versions of dependencies, but if others think it's a good idea I won't argue against it. +0 overall.

@amueller
Member Author

@rth I agree they shouldn't lead to errors post-release.

@cmarmo
Member

cmarmo commented Apr 14, 2022

The documentation has been updated in #13540, and the discussion about forcing errors on warnings locally doesn't seem to have reached consensus. Might this one be closed?

@thomasjpfan
Member

thomasjpfan commented Apr 15, 2022

Thank you @cmarmo for bringing this issue back up. Looking at this again, I think we can configure setup.cfg to only error for warnings from sklearn itself. I opened #23143 to resolve this issue.

Edit: Does not work.
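
For context, the approach tried in #23143 was presumably along these lines (a hypothetical reconstruction; per the edit above, it does not work, likely because the module pattern in a warning filter matches the module that triggers the warning, which for a deprecated sklearn API called from a test is the test module, not sklearn):

```ini
# Hypothetical reconstruction of the #23143 idea (per the edit above,
# this does not work): scope warnings-as-errors to warnings whose
# triggering module matches sklearn.
[tool:pytest]
filterwarnings =
    error::DeprecationWarning:sklearn.*
    error::FutureWarning:sklearn.*
```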

@thomasjpfan added and removed the Needs Decision - Close (Requires decision for closing) label Apr 15, 2022