
The serenity report is marking failed tests as passed #3433

Open
deepthi-ravindra opened this issue Apr 5, 2024 · 15 comments

@deepthi-ravindra

What happened?

The Serenity report has failures/errors, but on CI only flakes and passed tests are reported (on the console as well). The failed tests mainly involve waitUntilVisible and waitUntilDisplayed; these are not reported, and the report is marked GREEN when this happens.

Serenity version: 4.1.0

What did you expect to happen?

No response

Serenity BDD version

4.1.0

JDK version

17

Execution environment

No response

How to reproduce the bug.

Running 400+ tests in parallel on Azure DevOps, 1 or 2 fail with an element not displayed/visible error, yet the test report is green. However, after downloading the report, the failures/errors are there.

How can we make it happen?

Work on this myself and propose a PR (with Serenity BDD team guidance)

@wakaleo
Member

wakaleo commented Apr 5, 2024

Try to create a test that reproduces the issue.
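
A minimal sketch of that kind of reproduction, assuming the Serenity JUnit 4 runner; the class names, locator, and page object below are illustrative and not taken from the reporter's project:

import net.serenitybdd.annotations.Managed;
import net.serenitybdd.core.pages.PageObject;
import net.serenitybdd.junit.runners.SerenityRunner;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.openqa.selenium.WebDriver;

@RunWith(SerenityRunner.class)
public class WaitUntilVisibleErrorTest {

    @Managed              // uses the webdriver.driver configured for the project
    WebDriver driver;

    DemoPage demoPage;    // page objects are instantiated by Serenity

    @Test
    public void a_timed_out_wait_should_be_reported_as_an_error() {
        // The element never exists, so waitUntilVisible() times out and throws;
        // the Serenity report should record this scenario as an error, not as passed.
        demoPage.waitForMissingElement();
    }

    public static class DemoPage extends PageObject {
        public void waitForMissingElement() {
            getDriver().get("about:blank");
            $("#does-not-exist").waitUntilVisible();
        }
    }
}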

@Moncada25

Hi @wakaleo, I have the same error. I tracked it across versions and it has been happening since 4.0.46 (same version of the Gradle plugin), although it only happens in classes that implement the Question interface. Here is an example.

[screenshot: example Question implementation]

Using version 4.0.30:
[screenshot: test result report]

Using version 4.0.46+:
[screenshot: test result report]

JVM: 17 and 21
OS: macOS 14.4
Gradle: 7.6 and 8.7

@wakaleo
Member

wakaleo commented Apr 8, 2024

A Question should not contain an assertion; it should return a value, e.g.

override fun answeredBy(actor: Actor): Boolean {
    return actual == expected
}

@Moncada25

Is this a new behavior then? In previous versions it didn't work like that; asserts were possible anywhere you had access to the actor. The example you show is really the simplest one there can be; the validations in my test cases don't come down to a single assert. Even in the simplest scenario of just calling an endpoint, I need to validate not only the status code but also the body, body fields, schema, etc. Do I need to create a Question class for each assert, or how should it work? I think I'll use version 4.0.30 for eternity 😅

@wakaleo
Member

wakaleo commented Apr 8, 2024

It's not an intentional change, but having an assertion inside a question isn't a use case we would test. The purpose of a question is to query the state of the system, not to make an assertion; you make the assertion with the value returned by the question. Putting the assertion inside the question code may upset the exception-handling logic.
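
A sketch of that pattern in Java, assuming a REST-style check (the ResponseStatus class name and the expected value 200 are illustrative): the question only reads state, and the assertion is made on the value it returns.

import net.serenitybdd.rest.SerenityRest;
import net.serenitybdd.screenplay.Actor;
import net.serenitybdd.screenplay.Question;

public class ResponseStatus implements Question<Integer> {
    @Override
    public Integer answeredBy(Actor actor) {
        // Only query the state of the system; no assertions in here.
        return SerenityRest.lastResponse().statusCode();
    }
}

// In the test, the assertion is made on the value the question returns:
// actor.should(seeThat(new ResponseStatus(), equalTo(200)));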

@zeners
Contributor

zeners commented Apr 18, 2024

Maybe related to #3443?

@deepthi-ravindra
Author

deepthi-ravindra commented Apr 25, 2024

This issue still persists with version 4.1.10.
Versions I am using:
<serenity.version>4.1.0</serenity.version>
<serenity.cucumber.version>4.1.0</serenity.cucumber.version>
<cucumber.version>7.15.0</cucumber.version>
<open.csv.version>5.5.2</open.csv.version>
<log4j.version>2.17.1</log4j.version>
<slf4j.version>2.0.0-alpha6</slf4j.version>
<jsch.version>0.1.55</jsch.version>
<java.faker.version>1.0.2</java.faker.version>
<java.version>21</java.version>
<cucable.plugin.version>1.11.0</cucable.plugin.version>
<maven.failsafe.plugin.version>3.2.5</maven.failsafe.plugin.version>
<maven.surefire.plugin.version>3.2.5</maven.surefire.plugin.version>
junit - 4.13.2
Java 21
Apparently, the JUnit plugin shows the correct failures/errors but the Serenity report does not; the Serenity JUnit integration may be the problem.
The Serenity Maven plugin is configured like this in pom.xml:

<plugin>
    <groupId>net.serenity-bdd.maven.plugins</groupId>
    <artifactId>serenity-maven-plugin</artifactId>
    <version>${serenity.version}</version>
    <configuration>
        <reports>single-page-html</reports>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>net.serenity-bdd</groupId>
            <artifactId>serenity-single-page-report</artifactId>
            <version>${serenity.version}</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <id>serenity-reports</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>aggregate</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Am I doing something wrong?

@wakaleo
Member

wakaleo commented Apr 25, 2024

Yes: questions should not perform assertions; they should just query the state of the system and return a value.

@deepthi-ravindra
Author

No assertions are happening here, and I am not even using the actor-based model. It is a simple statement like editButton.waitUntilVisible(); that is failing. I added a bit of implicit wait and it is OK now. It looks like the test was failing at this point, but the report was marked as green. However, in the console, the end summary did say 1 error:
[INFO] | Test scenarios executed | 470
[INFO] | Total Test cases executed | 484
[INFO] | Tests passed | 483
[INFO] | Tests failed | 0
[INFO] | Tests with errors | 1
[INFO] | Tests compromised | 0
[INFO] | Tests aborted | 0
[INFO] | Tests pending | 0
[INFO] | Tests ignored/skipped | 0
So I am not sure why the report was marked as green.
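
For reference, a sketch of an explicit per-element timeout as an alternative to a global implicit wait (the EditPage class, locator, and 10-second timeout are illustrative, not from the reporter's project); either way, a timed-out wait should show up as an error in the report rather than green:

import java.time.Duration;
import net.serenitybdd.core.pages.PageObject;
import net.serenitybdd.core.pages.WebElementFacade;
import org.openqa.selenium.support.FindBy;

public class EditPage extends PageObject {

    @FindBy(id = "edit")
    WebElementFacade editButton;

    public void openEditor() {
        // Wait up to 10 seconds for the button; if it never becomes visible,
        // this throws, and the scenario should be recorded as an error.
        editButton.withTimeoutOf(Duration.ofSeconds(10)).waitUntilVisible();
        editButton.click();
    }
}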

@deepthi-ravindra
Author

deepthi-ravindra commented Apr 25, 2024

There were flakes where failed tests were rerun until they passed, but this test is not even marked as a flake and did not rerun. This is just one instance; the same thing happens with waitUntilPresent, etc. Here are the logs at the end of the console:

[WARNING] Tests run: 480, Failures: 0, Errors: 0, Skipped: 0, Flakes: 10
[INFO]
[INFO]
[INFO] --- serenity-maven-plugin:4.1.10:aggregate (serenity-reports) @ od-qa-automation ---
[INFO] GENERATING REPORTS FOR: /agent/_work/1/s
[INFO] GENERATING REPORTS USING 32 THREADS
[INFO] GENERATING SUMMARY REPORTS...
[INFO] GENERATING REQUIREMENTS REPORTS...
[INFO] GENERATING RESULT REPORTS...
[INFO] GENERATING ERROR REPORTS...
[INFO] Test results for 484 tests generated in 11.5 secs in directory: file:/agent/_work/1/s/target/site/serenity/
[INFO] ------------------------------------------------
[INFO] | SERENITY TESTS: | ERROR
[INFO] ------------------------------------------------
[INFO] | Test scenarios executed | 470
[INFO] | Total Test cases executed | 484
[INFO] | Tests passed | 483
[INFO] | Tests failed | 0
[INFO] | Tests with errors | 1
[INFO] | Tests compromised | 0
[INFO] | Tests aborted | 0
[INFO] | Tests pending | 0
[INFO] | Tests ignored/skipped | 0
[INFO] ------------------------------- | --------------
[INFO] | Total Duration| 17h 46m 6s
[INFO] | Fastest test took| 30s 239ms
[INFO] | Slowest test took| 7m 39s
[INFO] ------------------------------------------------
[INFO]
[INFO] SERENITY REPORTS
[INFO] - Full Report: file:///agent/_work/1/s/target/site/serenity/index.html
[INFO] - Single Page HTML Summary: file:///agent/_work/1/s/target/site/serenity/serenity-summary.html

[INFO] --- maven-failsafe-plugin:3.2.5:verify (default) @ od-qa-automation ---
[INFO] Failsafe report directory: /agent/_work/1/s/target/failsafe-reports
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 28:34 min
[INFO] Finished at: 2024-04-25T16:11:28Z
[INFO] ------------------------------------------------------------------------

See how the build is marked as SUCCESS?

@wakaleo
Member

wakaleo commented Apr 25, 2024

If you are rerunning failed tests with JUnit, that might be related. I never do that, so I don't know whether it is supported.

@deepthi-ravindra
Author

Re-running failed tests using the Failsafe plugin.
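
That rerun behaviour typically comes from Failsafe's rerunFailingTestsCount setting, along these lines (the count of 2 is illustrative); reruns that pass are counted as flakes in the console summary, while Serenity only records the latest run:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>${maven.failsafe.plugin.version}</version>
    <configuration>
        <!-- Failed tests are rerun up to 2 times; passes on rerun appear as "Flakes"
             in the console summary. Serenity records only the result of the latest run. -->
        <rerunFailingTestsCount>2</rerunFailingTestsCount>
    </configuration>
</plugin>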

@wakaleo
Member

wakaleo commented Apr 25, 2024

Serenity is not aware of rerun tests so will just record the latest result.

@deepthi-ravindra
Author

As I mentioned before, this particular test was not rerun. It ran only once and was marked as an error. All failed and errored tests do rerun, but errors like this one do not.

@wakaleo
Member

wakaleo commented Apr 25, 2024

I can't reproduce this, so I presume there is some project-specific configuration issue going on. Can you investigate and propose a PR?
