
Feature: add maxUsers #4462

Open
Amerousful opened this issue Sep 9, 2023 · 10 comments

@Amerousful
Contributor

Amerousful commented Sep 9, 2023

Hey!
I suggest adding a `maxUsers` method (similar to `maxDuration`) for interrupting the test. It can be useful for the closed model when you aim to run only a certain number of users and you don't know how long that takes.
What do you think? Is it a good idea?

@slandelle
Member

Hey,

Sorry, I don't get how it would work and how it would help with the use case you mentioned. Could you please elaborate?

I can maybe see a use case for the open workload model (concurrent users piling up because the SUT struggles), but then it looks similar to the panic criteria we're implementing for Gatling Enterprise, and then I would also expect criteria on response times and the error ratio to trigger.

@Amerousful
Contributor Author

Amerousful commented Sep 11, 2023

Okay, I'll try to explain with an abstract example. Imagine I have an application with 2 worker threads, and I have to load it with the closed workload model. I also want to process a certain number of requests, say 100, but we don't know how long each request takes: 100 ms, 1 s, 5 s, 30 s, and so on, so we cannot know the whole test duration in advance.

```scala
constantConcurrentUsers(2) during (??? minutes)
```

There is a workaround:

```scala
.doIf(session => session.userId == 100)(
  stopInjector()
)
```

But `maxUsers` seems more appropriate and intuitive.
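
Something like this, hypothetically (`maxUsers` is not an existing Gatling method, just the shape I'd imagine, mirroring `maxDuration`):

```scala
// Hypothetical sketch: keep 2 concurrent users running until 100 users
// in total have been started, then stop the injector. Like maxDuration,
// but counting users instead of elapsed time.
setUp(
  scn.inject(constantConcurrentUsers(2))
).maxUsers(100)
```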

@slandelle
Member

But `constantConcurrentUsers` is not going to increase the number of concurrent users. Why would the response times increase?

@Amerousful
Contributor Author

Amerousful commented Sep 11, 2023

Performance problems... some sort of cumulative issue where processing time increases after a while.

@slandelle
Member

OK, so `maxUsers` would be matched against the total number of users since the beginning of the test.
Honestly, this seems like a very specific use case. And there's the workaround you've indicated.
Let's see for a while if this ticket gets some traction.

@Amerousful
Contributor Author

Ok) Got it

@Amerousful
Contributor Author

Amerousful commented Sep 11, 2023

Perhaps I have a better example. With a duration for the closed model, we measure how many operations completed within a given time period.
`maxUsers` is the opposite: how much time do we need to process a specific number of operations?

Manager: We have a client who wants to upload 1000 documents to our system. We have 2 app workers processing this operation. Question: how long does it take?
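
To make the arithmetic concrete (my own illustration in plain Scala, not Gatling code; the 5 s per upload is an assumed number):

```scala
// Back-of-the-envelope math for the manager's question: with a closed model
// pinned at `concurrency` users, pushing `totalUsers` users through takes
// roughly ceil(totalUsers / concurrency) * perUserSeconds seconds.
def closedModelDuration(totalUsers: Int, concurrency: Int, perUserSeconds: Double): Double =
  math.ceil(totalUsers.toDouble / concurrency) * perUserSeconds

@main def demo(): Unit =
  // 1000 documents, 2 workers, assumed 5 s per upload
  println(closedModelDuration(1000, 2, 5.0)) // prints 2500.0
```

That answers "how long does it take?" only after the fact; `maxUsers` would let the test run exactly that long without guessing a `during` value up front.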

@gemiusz
Contributor

gemiusz commented Sep 14, 2023

> Manager: We have a client who wants to upload 1000 documents to our system. We have 2 app-workers which are processing this operation. Question: How long it takes?

I think that in this specific scenario you should use a Feeder with the Queue strategy, but I'm not sure how to create a proper scenario...
Maybe a loop of requests fed from the feeder inside a group would cover the case.
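
Roughly what I have in mind (a sketch only; the file name, endpoint, and `path` column are made up, and I haven't verified the exhaustion behavior):

```scala
// Sketch of the feeder-with-queue idea. With the queue strategy each record
// is consumed exactly once, and the run stops when the feeder is exhausted.
val docs = csv("documents.csv").queue // hypothetical file with a "path" column

val scn = scenario("upload 1000 docs")
  .repeat(500) { // 1000 records / 2 concurrent users
    feed(docs)
      .exec(http("upload").post("/documents").body(RawFileBody("#{path}")))
  }

setUp(scn.inject(atOnceUsers(2)))
```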
WDYT @slandelle

@Amerousful
Contributor Author

@gemiusz This has nothing to do with whether there is a feeder or not.
Here we are primarily discussing the closed workload model and how many requests in total we need to process.

@gemiusz
Contributor

gemiusz commented Sep 15, 2023

@Amerousful I'm looking for another solution for your case, so I'm trying to figure out whether this will help you, but looking at #4463 this can be problematic :/
