Simplify testing code that checks the Juju version #1037

Closed
tonyandrewmeyer opened this issue Oct 10, 2023 · 5 comments
Labels
feature (New feature or request), harness (Related to ops.testing)

Comments

@tonyandrewmeyer (Contributor) commented Oct 10, 2023

Charms, such as mysql-k8s-operator, have code like:

if ops.jujuversion.JujuVersion.from_environ().has_secrets:

in order to determine whether Juju features are available (and to behave differently as a result). However, the default version for from_environ() is 0.0.0, which means that all of the has_* methods that do a version check will return False if JUJU_VERSION isn't set in the environment.
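To make the fallback concrete, here is a minimal sketch (not the real ops implementation) of the behaviour described above: with no JUJU_VERSION in the environment, the parsed version is 0.0.0, so any minimum-version check fails. The 3.0.3 cut-off is taken from the test example later in this thread; the real check lives in ops.jujuversion.JujuVersion.has_secrets.

```python
import os


def juju_version_from_environ(environ=os.environ) -> tuple[int, int, int]:
    # Fall back to "0.0.0" when JUJU_VERSION is unset, mirroring
    # the ops.JujuVersion.from_environ() default described above.
    raw = environ.get("JUJU_VERSION", "0.0.0")
    # Drop any build suffix and pad short versions like "3.1" to three parts.
    parts = (raw.split("-")[0].split(".") + ["0", "0"])[:3]
    return tuple(int(p) for p in parts)


def has_secrets(version: tuple[int, int, int]) -> bool:
    # 3.0.3 matches the version used in the test example later in
    # this thread; the authoritative cut-off is in ops.jujuversion.
    return version >= (3, 0, 3)
```

With an empty environment this returns (0, 0, 0), so has_secrets() is False, which is exactly the surprise that breaks charm unit tests.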

A charmer can work around this, for example:

from unittest.mock import PropertyMock

import pytest
from ops import JujuVersion
from pytest_mock import MockerFixture


@pytest.fixture
def with_juju_secrets(mocker: MockerFixture):
    """Ensure that JujuVersion.has_secrets returns True."""
    mocker.patch.object(
        JujuVersion, "has_secrets", new_callable=PropertyMock
    ).return_value = True

or, more indirectly, by mocking the JUJU_VERSION environment variable. However, we could make this nicer for charmers.

One possibility would be to reverse the situation so that JujuVersion.from_environ() defaults to the latest stable version if JUJU_VERSION is not in the environment variables. This more closely reflects the testing.Harness behaviour where (simulated) secret functionality is always available. In production, it's less clear, so maybe this is only done when testing?

Alternatively, or additionally, we could provide a mechanism for specifying the (simulated) Juju version, particularly if we are already going to be patching that. For example:

def test_foo(self):
    with ops.testing.juju_version("3.3.3"):
        self.harness.charm.bar()

or

def test_foo(self):
    with ops.testing.juju_features(secrets=True, open_port_on_k8s=False):
        self.harness.charm.bar()

(This might give weird behaviour if you're simulating a feature set mix that never existed in Juju).

@benhoyt (Collaborator) commented Oct 10, 2023

Yeah, agreed this is a good idea. We'll have to discuss the exact API. It's probably going to be simpler to use and more discoverable if we make it a Harness method, probably def set_juju_version(self, version: str) -> None.

I think we should probably leave the production 0.0.0 fallback alone (unless there's an easy way to get the actual Juju version?) but change the default for testing to the latest stable version of Juju.

@benhoyt added the "feature" label Oct 10, 2023
@tonyandrewmeyer (Contributor, Author) commented Oct 11, 2023

For what it's worth, if my goal was to exercise all my tests under different (simulated) Juju versions - and I wasn't using unittest - then I think I would do it all entirely with pytest. For example, with this very simplified (and contrived) test:

import pytest
import ops

def test_secret():
    if ops.JujuVersion.from_environ() < "3.0.3":
        pytest.xfail("secrets are not available in Juju 2")
    assert ops.JujuVersion.from_environ().has_secrets

I would have a conftest like this:

import pytest

@pytest.fixture(autouse=True)
def _juju_version(request, monkeypatch):
    monkeypatch.setenv("JUJU_VERSION", request.param)

def pytest_generate_tests(metafunc):
    """Duplicate all tests under multiple (simulated) Juju versions."""
    metafunc.parametrize("_juju_version", ("2.9", "3.0.0", "3.0.3", "3.1", "3.3", "3.4"), indirect=True)
pytest output:

$ pytest -v
========================= test session starts =========================
platform linux -- Python 3.10.12, pytest-7.4.2, pluggy-1.3.0 -- /home/tameyer/scratch/generate_tests_example/.venv/bin/python3
cachedir: .pytest_cache
rootdir: /home/tameyer/scratch/generate_tests_example
collected 6 items

tests/unit/test_charm.py::test_secret[2.9] XFAIL (secrets are not available in Juju 2)   [ 16%]
tests/unit/test_charm.py::test_secret[3.0.0] XFAIL (secrets are not available in Juju 2) [ 33%]
tests/unit/test_charm.py::test_secret[3.0.3] PASSED                                      [ 50%]
tests/unit/test_charm.py::test_secret[3.1] PASSED                                        [ 66%]
tests/unit/test_charm.py::test_secret[3.3] PASSED                                        [ 83%]
tests/unit/test_charm.py::test_secret[3.4] PASSED                                        [100%]

==================== 4 passed, 2 xfailed in 0.04s =====================

With a Harness method, this would be more explicit than mocking the environment variable, e.g. (this assumes another fixture provides a harness that is used universally):

@pytest.fixture(autouse=True)
def _juju_version(request, harness):
    harness.set_juju_version(request.param)

I think it would be nice to avoid having to do something like this:

@pytest.mark.skipif(not ops.JujuVersion.from_environ().has_secrets, reason="Checks secret-specific code")

And instead also allow getting the version, e.g.:

@pytest.mark.skipif(not harness.get_juju_version().has_secrets, reason="Checks secret-specific code")

Although: (a) I'm not sure you can use a fixture in a pytest.mark.skipif, so this exact syntax might not work; (b) here the get() returns a JujuVersion object but the set() takes a string, which is inconsistent; and (c) if there are both get and set methods, it could probably be a property instead.

@benhoyt (Collaborator) commented Mar 8, 2024

@canonical/charm-tech will discuss this in person in Madrid to see if we can brainstorm a good approach.

@benhoyt (Collaborator) commented Mar 27, 2024

From @sed-i in a Matrix thread:

A popular charm lib, tls_certificates, started using JujuVersion.from_environ().has_secrets, and as a result, many of our harness tests now fail after fetch-lib.

Is there a harness setting to help with this? Otherwise we'd need to sprinkle a bunch of mocks.

(Ideally harness would pick up the lowest assumes juju from the charm's metadata.)

@benhoyt (Collaborator) commented May 15, 2024

In our team discussion, we decided the following:

  • We're planning not to add new features to Harness at this point, but focus our efforts on Scenario.
  • Scenario already has a Context(juju_version=...) arg that allows specifying this.
  • In the Scenario implementation, we'd use the juju_version parameter to gate new features, but not down to the level of bugfixes (for example, the fix for the secrets breakage) -- for the latter, we'd just push people to upgrade to the latest patch release.

@benhoyt closed this as not planned May 15, 2024