Integration tests vs. real Salt #108

Open
AdamMajer opened this issue Sep 8, 2015 · 2 comments

Comments

@AdamMajer
Member

Unit tests are a great resource, but there should also be a way of verifying that the API works correctly vs. a real Salt server instead of just a WireMock stub.

Recently, dozens and dozens of Salt calls have been added to the library. Most of these are simple convenience mappings, and they probably work just fine today. But what happens if Salt changes - e.g. an upstream decision to change something? Then maybe "status.diskusage" no longer exists for some reason, or it is replaced with a different return format. As it stands, we can't detect that. There are no automatic tests to verify the API remains valid (please correct me if I'm wrong here!).

There should be an automatic way to test this entire API vs. a real server.

1. automatically deploy container(s) and install salt-master and salt-minion(s) in them
2. test all API calls vs. that master (a rough smoke-test sketch follows below)
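
To make step 2 concrete, here is a minimal sketch of such a smoke test. It assumes Java 11+, a salt-api (rest_cherrypy) instance listening on localhost:8000, and placeholder credentials (testuser/testpass with PAM eauth); the URLs and JSON fields follow the documented rest_cherrypy interface, everything else is illustrative only:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SaltApiSmokeTest {
    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();

        // Log in to salt-api; rest_cherrypy returns the session token
        // in the X-Auth-Token response header.
        HttpRequest login = HttpRequest.newBuilder(URI.create("http://localhost:8000/login"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"username\":\"testuser\",\"password\":\"testpass\",\"eauth\":\"pam\"}"))
                .build();
        HttpResponse<String> loginResp = http.send(login, HttpResponse.BodyHandlers.ofString());
        String token = loginResp.headers().firstValue("X-Auth-Token").orElseThrow();

        // Run one module call against all minions of the test master.
        HttpRequest call = HttpRequest.newBuilder(URI.create("http://localhost:8000/"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json")
                .header("X-Auth-Token", token)
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"client\":\"local\",\"tgt\":\"*\",\"fun\":\"status.diskusage\"}"))
                .build();
        HttpResponse<String> resp = http.send(call, HttpResponse.BodyHandlers.ofString());

        // A renamed or broken module would show up here: either the call
        // fails outright or the returned payload comes back empty.
        if (resp.statusCode() != 200 || resp.body().isEmpty()) {
            throw new AssertionError("status.diskusage failed against the real master");
        }
        System.out.println(resp.body());
    }
}
```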

Comments welcome.

@moio
Member

moio commented Sep 8, 2015

Great idea, this would be very cool to have (ideally in Travis CI or, if that's not feasible, in a Dockerized workload triggered by SUSE's internal Jenkins host).

Pull requests are very much welcome in this direction!

@AdamMajer
Member Author

This can be done in Travis CI, to an extent, but maybe that approach is a little "overkill", since the aim here would be to test functionality across Salt versions. For regular testing, pre-generated test cases should suffice.

I propose:

  • record requests/responses from the real Salt server - this can be done with WireMock, in a "generate_tests" or similar target (see the recording sketch after this list).
  • replay the above in normal unit tests.
  • do this for all the calls.* packages
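
A minimal sketch of the recording step, assuming WireMock 2.x (which has a record-and-snapshot API; the 2015-era 1.x releases did the same thing via the standalone --proxy-all/--record-mappings mode) and a real salt-api on localhost:8000; the proxy port and output location are placeholders:

```java
import com.github.tomakehurst.wiremock.WireMockServer;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.options;

public class RecordSaltResponses {
    public static void main(String[] args) {
        // Start WireMock as a recording proxy in front of the real salt-api.
        WireMockServer wireMock = new WireMockServer(options().port(8888));
        wireMock.start();
        wireMock.startRecording("http://localhost:8000");

        // ... point the client at http://localhost:8888 here and exercise
        // every module in the calls.* packages ...

        // Persist the captured request/response pairs as stub mappings
        // (by default under src/test/resources/mappings), ready to be
        // replayed by the normal unit tests.
        wireMock.stopRecording();
        wireMock.stop();
    }
}
```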

Then we get the best of both worlds - reasonably fast unit tests, and tests against real Salt during the "generate_tests" phase. I think it would be sufficient to test:

  1. did the call succeed?
  2. is the result not empty? (i.e. parsing of the results worked) - see the replay sketch below
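
For the replay side, a sketch of what one generated test could look like, assuming JUnit 4, Java 11+, and a WireMockServer serving the previously recorded mappings (the file locations, port, and recorded token value are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import com.github.tomakehurst.wiremock.WireMockServer;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.options;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;

public class DiskusageReplayTest {
    private WireMockServer wireMock;

    @Before
    public void setUp() {
        // Serves the stub mappings recorded earlier (loaded from the
        // default src/test/resources/mappings directory).
        wireMock = new WireMockServer(options().port(8888));
        wireMock.start();
    }

    @After
    public void tearDown() {
        wireMock.stop();
    }

    @Test
    public void diskusageSucceedsAndReturnsData() throws Exception {
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest call = HttpRequest.newBuilder(URI.create("http://localhost:8888/"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json")
                .header("X-Auth-Token", "recorded-token")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"client\":\"local\",\"tgt\":\"*\",\"fun\":\"status.diskusage\"}"))
                .build();
        HttpResponse<String> resp = http.send(call, HttpResponse.BodyHandlers.ofString());

        // 1. did the call succeed?
        assertEquals(200, resp.statusCode());
        // 2. is the result not empty? (i.e. parsing produced something)
        assertFalse(resp.body().isEmpty());
    }
}
```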

Automatically generated tests would also save a lot of boilerplate code in the unit tests, in addition to getting up-to-date SaltStack responses.

I will see if I can get to this on Sunday or sometime next week. I've had very little free time in the last few weeks 😢

mbologna added a commit to mbologna/salt-netapi-client that referenced this issue Nov 2, 2015

Inspired by issue SUSE#108, I changed the Travis config to launch a salt-master (+ salt-api) container and two salt-minions.

salt-api is available at localhost:8000 during the test phase.

For more information about the containers, see docker/README.md
mbologna mentioned this issue Nov 2, 2015
mbologna added a commit to mbologna/salt-netapi-client that referenced this issue Apr 26, 2017 (same commit message as above)
mbologna added a commit to mbologna/salt-netapi-client that referenced this issue Sep 24, 2018 (same commit message as above)