
Forward thresholds execution results when running k6 -o cloud scripts.js #2626

Closed

oleiade opened this issue Aug 1, 2022 · 2 comments

oleiade (Member) commented Aug 1, 2022

Feature Description

As of this writing, when k6 is run with its output set to the cloud (k6 run -o cloud script.js), it behaves the same as a plain k6 run and does not forward its threshold results to the cloud backend. As a consequence, threshold results are not displayed in k6 cloud's web interface, because they are simply never sent.

Ideally, we would prefer k6 run -o cloud to keep evaluating thresholds locally, as it does today, but also to include the results of the thresholds' execution in the output and forward them to the cloud, so they can be interpreted and displayed there.

Suggested Solution (optional)

It is as yet unclear how we would address this; it will probably involve some non-OSS work behind closed doors.

Already existing or connected issues / PRs (optional)

  • Part of the initial scope of #1443 was to define a shared representation format for thresholds, with the long-term idea of first unifying execution and then allowing k6 to stream threshold results to the cloud.
@oleiade oleiade added the feature label Aug 1, 2022
@oleiade oleiade self-assigned this Aug 1, 2022
na-- (Member) commented Aug 1, 2022

Hmm is this a regression? IIRC this used to work with old k6 versions, and the code still seems to be in the cloud output 😕 See:

err := out.testFinished()

k6/output/cloud/output.go, lines 682 to 710 at commit 1e84682:

func (out *Output) testFinished() error {
	if out.referenceID == "" || out.config.PushRefID.Valid {
		return nil
	}

	testTainted := false
	thresholdResults := make(cloudapi.ThresholdResult)
	for name, thresholds := range out.thresholds {
		thresholdResults[name] = make(map[string]bool)
		for _, t := range thresholds {
			thresholdResults[name][t.Source] = t.LastFailed
			if t.LastFailed {
				testTainted = true
			}
		}
	}

	out.logger.WithFields(logrus.Fields{
		"ref":     out.referenceID,
		"tainted": testTainted,
	}).Debug("Sending test finished")

	runStatus := lib.RunStatusFinished
	if out.runStatus != lib.RunStatusQueued {
		runStatus = out.runStatus
	}

	return out.client.TestFinished(out.referenceID, thresholdResults, testTainted, runStatus)
}

Thresholds ThresholdResult `json:"thresholds"`

@na-- na-- added the cloud and evaluation needed (proposal needs to be validated or tested before fully implementing it in k6) labels Aug 1, 2022
na-- (Member) commented Aug 1, 2022

Ah, so the actual task is about the metric values that are compared to the threshold constraints, since we only send the threshold names and their pass/fail status at the moment:

type ThresholdResult map[string]map[string]bool

But then this is a duplicate of this older issue, so I'll close it: #1956

@na-- na-- closed this as completed Aug 1, 2022
@na-- na-- added the duplicate label and removed the evaluation needed (proposal needs to be validated or tested before fully implementing it in k6) label Aug 1, 2022