
Remove the core.Engine 🎉 #2815

Merged
merged 7 commits into from Feb 2, 2023

Conversation

Member

@na-- na-- commented Dec 8, 2022

This PR builds on top of #2813 and closes #1889, finally 🎉 🎊 It also leaves us in a much better position to implement #1342, #140, #2804, #2430, and a myriad of other issues 🎉

Some of the less important tests still need to be ported, so it is still technically a draft PR that's not quite ready to merge as it is. Still, given that the new integration tests all work fine, I am fairly confident there aren't any major problems. I also don't intend to make any big changes to the code here, so it's more than ready for review.

@na-- na-- marked this pull request as ready for review December 9, 2022 09:05
@na-- na-- requested a review from codebien December 9, 2022 11:06
@na-- na-- added this to the v0.43.0 milestone Dec 9, 2022
cmd/run.go (Outdated)
Comment on lines +167 to +176
// We call waitOutputsFlushed() below because the threshold calculations
// need all of the metrics to be sent to the MetricsEngine before we can
// calculate them one last time. We need the thresholds calculated here,
// since they may change the run status for the outputs.
Collaborator

Instead of doing this via the output manager, could we do an explicit call to the ingester? I think it would simplify the logic a bit.

Member Author

hmm, maybe, I'm not sure it would be simpler though 🤔 here in the defer statements, we just have the following sequence after the test run finishes:

  1. close the samples channel
  2. wait for all outputs to finish flushing their metrics
  3. finalize the thresholds (unless --no-thresholds was used)
  4. stop all outputs

to me, this makes sense even if we completely ignore thresholds 🤔 and I am not sure how we can simplify it, when the piping of the metrics is also done by the output.Manager 🤔
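
For reference, here is a minimal Go sketch of that four-step cleanup sequence. It is only an illustration of the order described above; the parameter names (waitOutputsFlushed, finalizeThresholds, stopOutputs) are assumed placeholders, not the actual k6 internals.

// finishTest sketches the deferred cleanup order after the test run finishes.
// All parameters are hypothetical stand-ins for the real k6 components.
func finishTest(
	samples chan interface{}, // the channel the test run writes metric samples to
	waitOutputsFlushed func(), // blocks until all outputs have flushed their buffers
	finalizeThresholds func(), // final threshold evaluation; may change the run status
	stopOutputs func(), // stops all outputs
	noThresholds bool, // true when --no-thresholds was used
) {
	close(samples)       // 1. no new samples can be emitted after this point
	waitOutputsFlushed() // 2. the MetricsEngine ingester now has every sample
	if !noThresholds {
		finalizeThresholds() // 3. last threshold pass, unless --no-thresholds was used
	}
	stopOutputs() // 4. outputs shut down with the final run status available
}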

Collaborator

@codebien codebien Feb 2, 2023

Not a blocking point: I think it would be helpful to add these 4 points to the comment so we have one central place where we can read the flow, without going through the entire code of the Run method. It could also be helpful when we work on #2430.

Member Author

@na-- na-- Feb 2, 2023

👍, but let's add this in the PR that converts the few remaining tests for the old Engine; I'll merge this one shortly

output/manager.go
metrics/engine/engine.go (Outdated)
errext/exitcodes/codes.go
cmd/run.go
cmd/run.go
@codecov-commenter

codecov-commenter commented Jan 16, 2023

Codecov Report

Merging #2815 (e73e804) into master (134ca65) will increase coverage by 0.03%.
The diff coverage is 82.85%.

@@            Coverage Diff             @@
##           master    #2815      +/-   ##
==========================================
+ Coverage   76.81%   76.84%   +0.03%     
==========================================
  Files         225      222       -3     
  Lines       16867    16846      -21     
==========================================
- Hits        12956    12945      -11     
+ Misses       3078     3065      -13     
- Partials      833      836       +3     
Flag Coverage Δ
ubuntu 76.84% <82.85%> (+0.13%) ⬆️
windows ?

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
cmd/ui.go 68.87% <ø> (-1.03%) ⬇️
cmd/run.go 71.20% <76.27%> (-0.30%) ⬇️
output/manager.go 84.09% <91.66%> (+13.25%) ⬆️
execution/scheduler.go 93.09% <100.00%> (+0.26%) ⬆️
metrics/engine/engine.go 88.88% <100.00%> (+4.06%) ⬆️
loader/filesystems.go 60.00% <0.00%> (-40.00%) ⬇️
js/common/initenv.go 66.66% <0.00%> (-33.34%) ⬇️
errext/abort_reason.go 50.00% <0.00%> (-25.00%) ⬇️
api/v1/metric.go 68.18% <0.00%> (-13.64%) ⬇️
lib/testutils/minirunner/minirunner.go 77.02% <0.00%> (-5.41%) ⬇️
... and 18 more


Collaborator

@codebien codebien left a comment

LGTM 🎉

Base automatically changed from prune-the-engine to master January 27, 2023 09:30
@na--
Member Author

na-- commented Jan 30, 2023

I split off the first commit into a separate PR (#2885) to fix a recent regression introduced by 19dd505 from #2813. See the new PR for more details, and I'll rebase this PR on top of it.

@na-- na-- force-pushed the remove-the-engine branch 2 times, most recently from 16c82bc to ee62441 on January 30, 2023 17:07
na-- added a commit that referenced this pull request Jan 30, 2023
This is the first commit from #2815, plus an extra test that checks the exit code when aborting a `k6 run --paused` test with Ctrl+C. 

Unfortunately, it turned out that 19dd505 from #2813 introduced a regression. Interrupting the k6 process with a signal should result in a `105` exit code (`ExternalAbort`), but with that commit k6 exited with `103` if k6 was `--paused`. The good news is that the fix was already done in the first commit of #2815, which was entirely self-sufficient and can be moved to this separate PR. As a bonus, this makes reviewing everything even easier... 

So, some details about the commit itself... In short, there is no need to have 2 separate `Run()` and `Init()` methods in the `execution.Scheduler`. Instead, `Run()` can start the VU initialization itself. As a bonus, this immediately makes the error handling around init errors much more in line with the other error handling, allowing us to resolve the bug, but also to respect `--linger` and to try to execute `handleSummary()` if there were problems. The exception is, of course, the first init that is used to get the exported options; that one is still special.
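
To make the described change concrete, here is a rough sketch of the shape this gives the scheduler. The helper methods (initVUsAndExecutors, runExecutors), the SampleContainer type and the exact signature are assumptions for illustration, not the real execution.Scheduler API.

// Before: callers had to run two phases, with init errors handled separately:
//	if err := sched.Init(ctx, samples); err != nil { /* special-cased exit */ }
//	if err := sched.Run(ctx, samples); err != nil { /* normal error path */ }
//
// After: Run() starts the VU initialization itself, so init errors go through
// the same path as run errors (--linger, handleSummary(), consistent exit codes).
func (s *Scheduler) Run(ctx context.Context, samples chan<- SampleContainer) error {
	if err := s.initVUsAndExecutors(ctx, samples); err != nil {
		return err // handled by the caller like any other test-run error
	}
	return s.runExecutors(ctx, samples)
}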
cmd/run.go (Outdated)
codebien and others added 3 commits February 2, 2023 10:21
This reverts a big part of #2885 because with all of the logic in Run(), k6 didn't know if it needed to wait when --linger was specified.

In general, if a script error or test.abort() occurs during init, --linger should not apply and k6 should exit immediately. But if test.abort() is called during the test run itself, or the test was stopped in some other way besides Ctrl+C, --linger means that k6 should not exit immediately after stopping the test.
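
A hypothetical sketch of that decision, just to make the described behavior concrete; the variable and function names are illustrative, not the actual cmd/run.go code.

// --linger only applies if the test made it past init: script errors and
// test.abort() during init exit immediately, while a stop during the run
// itself (other than Ctrl+C) keeps k6 alive when --linger is set.
if lingerEnabled && initSucceeded && !interruptedBySignal {
	// Keep the process (REST API, outputs, end-of-test UI) alive until the
	// user explicitly interrupts it.
	waitForInterrupt(globalCtx)
}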
Collaborator

@codebien codebien left a comment

LGTM

Development

Successfully merging this pull request may close these issues.

Remove the Engine and split metrics/VUs handling
4 participants