
Improve threshold parsing and errors #961

Closed
na-- opened this issue Mar 15, 2019 · 8 comments
Labels
enhancement, evaluation needed (proposal needs to be validated or tested before fully implementing it in k6), high prio, refactor, ux

Comments

na-- (Member) commented Mar 15, 2019

Currently, threshold calculations are actually executed in mini JavaScript runtimes: https://github.com/loadimpact/k6/blob/master/stats/thresholds.go

That's hard to beat for flexibility, but in the long term we should probably transition to a simple DSL that's easy to parse and verify. It will likely be needed for things like distributed execution anyway, and it will definitely be useful for reducing user errors and confusion. For example, here are some thresholds that are wrong but won't actually abort the k6 execution:

export let options = {
    thresholds: {
        "http_req_duration": [
            // this never fails...
            "avg=123",
            // this only emits a warning every second
            "throw new Error('wat')",
            // this is never reached... but if it were, it would only emit a warning as well
            "test>=500",
        ],
    },
};
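
For contrast, a sketch of what valid expressions for this metric could look like (the particular aggregations and numbers are illustrative assumptions, not necessarily what the author of such a config intended):

export let options = {
    thresholds: {
        "http_req_duration": [
            // a comparison instead of an assignment
            "avg<123",
            // p(95) is one of the aggregations the Trend metric actually exposes
            "p(95)<500",
        ],
    },
};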

As a short-term fix, we should probably make thresholds abort the script when there are errors. It's also worth investigating whether we can make the values we set in the VM consts, so that using = instead of == produces an error.
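
To illustrate the const idea with plain JavaScript semantics (a minimal sketch, not the actual k6 VM wiring): assigning to a const binding throws a TypeError, so the silent avg=123 mistake above would become a hard error:

// if the threshold runtime declared the aggregated values as consts...
const avg = 250;
// ...then an accidental assignment like the "avg=123" threshold above
// would throw instead of silently evaluating:
avg = 123; // TypeError: Assignment to constant variable.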

na-- added the enhancement, ux, refactor, and evaluation needed labels Mar 15, 2019
na-- (Member, Author) commented Apr 9, 2019

For the new threshold parsing code, we might consider this library: https://github.com/PaesslerAG/gval

I saw it while researching the jsonpath libraries @robingustafsson mentioned in #992; the second one is based on it.
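
For illustration, a minimal sketch of what parsing and evaluating a threshold-like expression with gval could look like (the metric name and values are made up):

package main

import (
	"fmt"

	"github.com/PaesslerAG/gval"
)

func main() {
	// Parse and evaluate an expression against a map of aggregated
	// metric values; a malformed expression is reported as an error
	// instead of silently doing nothing.
	result, err := gval.Evaluate("p95 < 500", map[string]interface{}{
		"p95": 230.0,
	})
	if err != nil {
		fmt.Println("invalid threshold expression:", err)
		return
	}
	fmt.Println("threshold passed:", result) // threshold passed: true
}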

na-- (Member, Author) commented Nov 22, 2019

We stumbled upon a k6 demo presentation and spotted a somewhat related issue: using rate in the threshold for a Counter metric: https://youtu.be/Hu1K2ZGJ_K4?t=1103

Ideally, this should have resulted in a configuration error before the script even started, since combining these things doesn't make sense. Unfortunately, because of the current laissez-faire threshold implementation, the script ran successfully 😞 . In the demo they quickly noticed that there was an issue, since they were expecting an error. But in a real test setup, a k6 user might set up a threshold slightly incorrectly like that and then run their script in CI with the potentially false assurance that their system performs within the specified thresholds...
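
A hypothetical reconstruction of that class of mistake (the exact expression from the demo isn't quoted here, so the metric name and threshold are guesses):

import { Counter } from "k6/metrics";

// A Counter just counts occurrences, so a Rate-style "rate<0.1"
// threshold on it doesn't check what the author likely intended,
// yet the old JS-based implementation ran it without complaint.
const errors = new Counter("errors");

export let options = {
    thresholds: {
        "errors": ["rate<0.1"],
    },
};

export default function () {
    errors.add(1);
}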

Thanks, @cajames, for your very good demo! I guess us using it as a mini case study probably wasn't its original purpose, but it was very useful for us nonetheless 😅 Feel free to ping us here or on Slack if you have seen any other bugs or if you have suggestions for improving k6 in any way!

na-- added the high prio label Nov 22, 2019
cajames commented Nov 26, 2019

Hey @na--, thanks for watching!

In the demo they quickly noticed that there was an issue, since they were expecting an error.

Yep, I had chalked it up to me passing the wrong configuration, but if k6 did throw an error on startup in the case of threshold misconfiguration, that'd be a great dev experience!

I guess us using it as a mini case study probably wasn't its original purpose, but it was very useful for us nonetheless.

Happy it could be a helpful case study. :)

Feel free to ping us here or on Slack if you have seen any other bugs or if you have suggestions for improving k6 in any way!

And will do! Thanks. Really like what you guys have created!

na-- (Member, Author) commented Feb 1, 2021

#1832 (or something like it) is a prerequisite for the current issue.

imiric (Contributor) commented Feb 1, 2021

Not sure if this is the right place to mention this, but currently thresholds defined for specific scenarios are not triggered. See this forum post.

This doesn't seem to require the overhaul of #1832, so should I create a new issue for it?
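
For reference, a hypothetical sketch of the kind of threshold in question, scoping the built-in vus metric to one scenario via the sub-metric tag syntax (the scenario name is illustrative):

export let options = {
    thresholds: {
        // never triggers: vus samples carry no scenario tag
        // (see the next comment)
        "vus{scenario:my_scenario}": ["value<100"],
    },
};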

mstoykov (Collaborator) commented Feb 1, 2021

Not sure if this is the right place to mention this, but currently thresholds defined for specific scenarios are not triggered. See this forum post.

This is not "true". VUs and VUsMax are being emitted completely independently of scenarios or anything else:
https://github.com/loadimpact/k6/blob/586cd110358a55f5dff3c1dc88c1980e8f71a7ea/core/engine.go#L340-L362
The only way for them to get tags is to add global system tags through --tag.
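
For instance (an illustrative invocation with an arbitrary tag name), a global system tag set on the command line is attached to every emitted sample, including vus and vus_max:

k6 run --tag env=staging script.js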

But it would be nice if VUs and VUsMax were actually emitted by the executors and tagged accordingly. I think we discussed this around #1007, but I don't know if there is an issue for it.

na-- (Member, Author) commented Feb 1, 2021

Yeah, we discussed something like that, but never got around to implementing it, since it would have delayed #1007 even more, and since it would have been a minor breaking change. I can see a benefit to emitting vus and vus_max per scenario, so @imiric, please create a new issue for that, though leave it low prio + evaluation needed, since it's not clear we should make the change. For example, with vus_max, it might be better to retire that metric and use something like vus_initialized instead.

oleiade (Member) commented Mar 10, 2022

The scope of the initial issue is now covered in v0.37.0, following #2400 and the new Go-based threshold parsing. The original example:

export let options = {
    thresholds: {
        "http_req_duration": [
            // this never fails...
            "avg=123",
            // this only emits a warning every second
            "throw new Error('wat')",
            // this is never reached... but if it were, it would only emit a warning as well
            "test>=500",
        ],
    },
};

now results in k6 failing before the test starts with:

ERRO[0000] failed parsing threshold expression "my invalid expression"; reason: malformed threshold expression

oleiade closed this as completed Mar 10, 2022