
Fix #20139: Surface Acceptance Test results in a more convenient location #20171

Open · wants to merge 30 commits into base: develop

Conversation

@rahat2134 (Contributor) commented Apr 16, 2024

Overview

  1. This PR fixes or fixes part of [Feature Request]: Surface Acceptance Test results in a more convenient location #20139
  2. This PR does the following: when viewing the results of acceptance tests in the GitHub workflows, the most important part of the log, the test results, is buried under logs from parent and supporting processes (e.g. the initialization and shutdown of the server). This PR fixes that.

Essential Checklist

Please follow the instructions for making a code change.

  • I have linked the issue that this PR fixes in the "Development" section of the sidebar.
  • I have checked the "Files Changed" tab and confirmed that the changes are what I want to make.
  • I have written tests for my code.
  • The PR title starts with "Fix #bugnum: " or "Fix part of #bugnum: ...", followed by a short, clear summary of the changes.
  • I have assigned the correct reviewers to this PR (or will leave a comment with the phrase "@{{reviewer_username}} PTAL" if I can't assign them directly).

Proof that changes are correct

[screenshot]

PR Pointers

@rahat2134 rahat2134 requested a review from a team as a code owner April 16, 2024 11:37

oppiabot bot commented Apr 16, 2024

Hi @rahat2134, can you complete the following:

  1. The body of this PR is missing the "Proof that changes are correct" section; please update it to include that section.

Thanks!


oppiabot bot commented Apr 16, 2024

Assigning @StephenYu2018 for the first pass review of this PR. Thanks!

@rahat2134 rahat2134 requested review from agallop and removed request for StephenYu2018 April 16, 2024 11:37
@rahat2134 rahat2134 assigned agallop and unassigned StephenYu2018 and rahat2134 Apr 16, 2024
@rahat2134 (Contributor Author)

@agallop PTAL.

@agallop (Collaborator) commented Apr 16, 2024

Hello Rahat,

The goal of the original issue is to add a step to the GitHub Action that prints only the test output.

[screenshot: GitHub Actions workflow log]


For example, I would like this action to output only:

Running suite with 6 specs.

--------------------------------------------------
| 1. Suite started: Logged-in User in About page |
--------------------------------------------------

Spec started : Logged-in User in About page should open Math Classroom page with the Browse Our Lessons button.
-> Passed [ Took 3.04 seconds ]

Spec started : Logged-in User in About page should open Android page with the Access Android App button.
-> Passed [ Took 2.994 seconds ]

Spec started : Logged-in User in About page should open Math Classroom page with the Visit Classroom button.
-> Passed [ Took 3.028 seconds ]

Spec started : Logged-in User in About page should open Community Library page with the Browse Library button.
-> Passed [ Took 3.185 seconds ]

Spec started : Logged-in User in About page should open Creator Dashboard page and Exploration Editor with the Create Lessons button
LOG: The Create Lessons button opens the Creator Dashboard in Create Mode.
LOG: The Create Lessons button displays the Exploration Editor page.
-> Passed [ Took 7.009 seconds ]

Spec started : Logged-in User in About page should open Math Classroom page with the Explore Lessons button.
-> Passed [ Took 5.337 seconds ]



6 specs, 0 failures
Finished in 40.275 seconds

The full output should still be printed in a separate expansion panel, and it should still be possible to print it when running locally.

@agallop agallop assigned rahat2134 and unassigned agallop Apr 16, 2024
@StephenYu2018 (Contributor) left a comment

A couple of concerns.

scripts/run_acceptance_tests_test.py (outdated)
@@ -349,3 +350,17 @@ def test_start_tests_for_long_lived_process(self) -> None:
with self.swap_mock_set_constants_to_default:
with self.swap(constants, 'EMULATOR_MODE', True):
run_acceptance_tests.main(args=['--suite', 'testSuite'])

def test_print_test_output(self) -> None:

Please include the result of printing test output in the test name

@@ -1,4 +1,4 @@
-# Copyright 2023 The Oppia Authors. All Rights Reserved.
+# Copyright 2024 The Oppia Authors. All Rights Reserved.

Thanks for this! :)

scripts/run_acceptance_tests_test.py (outdated)
@rahat2134 rahat2134 requested a review from a team as a code owner April 21, 2024 06:28
@rahat2134 rahat2134 requested review from U8NWXD and removed request for U8NWXD April 21, 2024 06:28
@rahat2134 (Contributor Author)

Correcting the failing backend tests.

@rahat2134 rahat2134 marked this pull request as draft April 29, 2024 10:41
@rahat2134 rahat2134 marked this pull request as ready for review April 30, 2024 16:45
@rahat2134 (Contributor Author)

@agallop PTAL.

@rahat2134 rahat2134 assigned agallop and unassigned rahat2134 Apr 30, 2024
@@ -392,8 +392,12 @@ jobs:
run: python -m scripts.build --prod_env
- name: Run Desktop Acceptance Test ${{ matrix.suite }}
run: xvfb-run -a --server-args="-screen 0, 1285x1000x24" python -m scripts.run_acceptance_tests --skip-build --suite=${{ matrix.suite }} --prod_env

Can you add a flag for this new behavior and set it here?

@@ -177,6 +194,9 @@ def main(args: Optional[List[str]] = None) -> None:
with servers.managed_portserver():
_, return_code = run_tests(parsed_args)

with open('test_output.log', 'r', encoding='utf-8') as output_file:

The test output is being printed a second time here, which isn't necessary.

@agallop (Collaborator) left a comment

Thanks Rahat! Some more comments

@@ -99,6 +99,21 @@ def compile_test_ts_files() -> None:
os.path.join(build_dir_path, 'images'))


def print_test_output(output_lines: List[bytes]) -> None:

I do think this is all very complex. You can just print all the lines.

In the event of a failure, I don't think this will print the call stack, which we would want to see since it tells you where and why the test failed.

@@ -392,8 +392,12 @@ jobs:
run: python -m scripts.build --prod_env
- name: Run Desktop Acceptance Test ${{ matrix.suite }}
run: xvfb-run -a --server-args="-screen 0, 1285x1000x24" python -m scripts.run_acceptance_tests --skip-build --suite=${{ matrix.suite }} --prod_env
- name: Display Desktop Test Output

Display Desktop Acceptance Test Output

- name: Run Mobile Acceptance Test ${{ matrix.suite }}
run: xvfb-run -a --server-args="-screen 0, 1285x1000x24" python -m scripts.run_acceptance_tests --skip-build --suite=${{ matrix.suite }} --prod_env --mobile
- name: Display Mobile Test Output

Display Mobile Acceptance Test Output

@oppiabot oppiabot bot unassigned agallop May 4, 2024

oppiabot bot commented May 4, 2024

Unassigning @agallop since the review is done.


oppiabot bot commented May 4, 2024

Hi @rahat2134, it looks like some changes were requested on this pull request by @agallop. PTAL. Thanks!


oppiabot bot commented May 11, 2024

Hi @rahat2134, I'm going to mark this PR as stale because it hasn't had any updates for 7 days. If no further activity occurs within 4 days, it will be automatically closed so that others can take up the issue.
If you are still working on this PR, please make a follow-up commit within 4 days (and submit it for review, if applicable). Please also let us know if you are stuck so we can help you!

@oppiabot oppiabot bot added the stale label May 11, 2024
@oppiabot oppiabot bot removed the stale label May 11, 2024

oppiabot bot commented May 18, 2024

Hi @rahat2134, I'm going to mark this PR as stale because it hasn't had any updates for 7 days. If no further activity occurs within 4 days, it will be automatically closed so that others can take up the issue.
If you are still working on this PR, please make a follow-up commit within 4 days (and submit it for review, if applicable). Please also let us know if you are stuck so we can help you!

@oppiabot oppiabot bot added the stale label May 18, 2024
@oppiabot oppiabot bot removed the stale label May 19, 2024