
Threshold improvements #1136

Closed
na-- opened this issue Sep 2, 2019 · 4 comments
Labels
enhancement, evaluation needed, feature

Comments

na-- (Member) commented Sep 2, 2019

Currently, you cannot specify a threshold for something like "requests per second", "errors per minute", or "values per period" in general.

Somewhat related to the above, threshold evaluations are based only on all of the metrics gathered over the whole duration of the test. Or, more precisely (when you have abortOnFail), on the metrics gathered from the start of the test up until a certain moment. That evaluation can be delayed by the delayAbortEval option, but there's no way to specify a time window of metrics to evaluate, like "p(99) of request duration over the last 30 minutes should be < X" or something like that.
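Purely to illustrate the two use cases, and emphatically not valid k6 threshold syntax today, the expressions could look something like this (both the rate(...) form and the windowed p(99, ...) form are hypothetical):

export let options = {
    thresholds: {
        // hypothetical "values per period" expression, e.g. requests per minute - not supported today
        "http_reqs": ["rate(1m) > 1000"],
        // hypothetical percentile over a time window of the last 30 minutes - not supported today
        "http_req_duration": ["p(99, last 30m) < 500"],
    },
};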

Both of these things would be complicated to implement and need very careful evaluation of whether we should even do them. The main reason for not doing something like this, besides the complexity, is that we don't want the required data crunching to affect the test execution and skew the results. We also don't want to increase k6's memory usage even further than it currently is (#1068). So these are most likely prerequisites for any such effort: #961, #763

na-- added the enhancement, feature, and evaluation needed labels on Sep 2, 2019
na-- (Member, Author) commented Nov 22, 2019

Another thing gleaned from @cajames's excellent k6 presentation (specifically, this question at the end): somewhat related to the improvements above, users might be interested in setting thresholds only for a specific period of the test - for example, when they simulate a sharply spiked load on their system.

As mentioned in the demo, this can be achieved with custom metrics, though I think it may be somewhat hacky and tedious to set up that way. It might be slightly easier after #1007 is merged, though it's still going to be tricky. But since we're looking to introduce the concept of time into thresholds more deeply, this is a valid use case we should probably support when we enhance thresholds. That way there wouldn't be any need for custom metrics.
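For reference, a minimal sketch of that custom-metrics workaround - the URL, the stage timings, and the elapsed-time check are only illustrative:

import http from "k6/http";
import { Trend } from "k6/metrics";

// Collects request durations only while the spike is active.
let spikeReqDuration = new Trend("spike_req_duration");
let testStart = Date.now(); // roughly the test start (init code runs per VU)

export let options = {
    stages: [
        { duration: "1m", target: 10 },
        { duration: "30s", target: 200 }, // the spike we actually care about
        { duration: "1m", target: 10 },
    ],
    thresholds: {
        // evaluated only over the samples added during the spike
        "spike_req_duration": ["p(99)<250"],
    },
};

export default function () {
    let res = http.get("https://test.k6.io/");
    let elapsedSec = (Date.now() - testStart) / 1000;
    if (elapsedSec > 60 && elapsedSec <= 90) {
        spikeReqDuration.add(res.timings.duration);
    }
}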

Another approach would be to tag metrics coming from different executors (after #1007) or stages (#796) and use the existing tag-based filtering for thresholds:

export let options = {
    thresholds: {
        "http_req_duration{stage:someStageID}": ["p(99)<250"],
    },
    // ...
};

This is probably the more robust approach in general, and easier to implement for us, but I can see some use cases where the time-based threshold configuration would be preferred, so we should probably do both.
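Until executors or stages tag metrics automatically, a rough approximation with the current tag-based filtering is to tag the relevant requests manually (the stage name and URL here are just examples):

import http from "k6/http";

export let options = {
    thresholds: {
        "http_req_duration{stage:spike}": ["p(99)<250"],
    },
};

export default function () {
    // requests issued during the spike carry the tag, so the submetric exists
    http.get("https://test.k6.io/", { tags: { stage: "spike" } });
}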

na-- (Member, Author) commented Dec 11, 2019

As was pointed out in #1267 (comment), another improvement we can make (and this one can probably be done even before we implement #961 or #763) is to calculate the thresholds on more than one core. Depending on how we implement the things above and in #763, it might make sense to have a single goroutine be responsible for all of the thresholds of a single metric?

yorugac (Contributor) commented Oct 13, 2021

A related case appeared in the forum here: thresholds are used there only as a workaround to get submetrics, but the question was about how to get a submetric's rate over the scenario's duration rather than over the duration of the whole test. Scenarios in this case act as a kind of substitute for a time period within the test, so it might make sense to consider that as an option as well.
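For context, a sketch of what such a per-scenario submetric threshold can look like with the built-in scenario tag (the scenario name and the limit are illustrative) - the catch being that the rate is still computed over the whole test run, not over the scenario's own duration:

export let options = {
    scenarios: {
        contacts: {
            executor: "constant-vus",
            vus: 10,
            duration: "1m",
        },
    },
    thresholds: {
        // per-scenario submetric via the built-in "scenario" tag
        "http_req_failed{scenario:contacts}": ["rate<0.01"],
    },
};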

codebien (Collaborator) commented Jan 5, 2024

Closing in favor of #2379, since this is a duplicate. That issue already links back to this one.

codebien closed this as completed on Jan 5, 2024