
Integrating with Locust #2646

Closed
2 tasks done
ichsansaid opened this issue Mar 21, 2024 · 4 comments
Labels
bug stale Issue had no activity. Might still be worth fixing, but don't expect someone else to fix it

Comments

@ichsansaid

Prerequisites

Description

Hello,

I'm currently developing a Low-Code Development Platform (LCDP) and want to create a pre-built component for performance testing.

I'm using FastAPI as the platform's backend and Locust for performance testing.

I have an API called execute_scenario that executes tasks. For example, I can execute a scenario called "LocustScenario."

My approach is to use Locust as a library. So, each executed scenario of "LocustScenario" will create an environment and start the performance test. I also capture the statistics every second (using gevent.sleep) and send them to our message broker.

However, I'm facing two issues:

First Issue: monkey.patch_all() causes blocking in FastAPI. Therefore, I decided to remove it manually from __init__.py for now.
Second Issue: When I run a performance test against HTTP, there are no issues: our message broker receives the messages every second. However, when I run a performance test against HTTPS, there is a somewhat random delay of 3-5 seconds when capturing the statistics every second using gevent.sleep(1).

Command line

library

Locustfile contents

def start(self, max_user: int, ramp: int, duration: int = 10, on_live_stats: Callable = None):
    try:
        # Forward live stats to the callback from a separate greenlet
        gevent.spawn(NewOnLiveStats(on_live_stats), self)
        self.runner.start(max_user, spawn_rate=ramp)
        if duration > -1:
            # Schedule the runner to quit after `duration` seconds
            gevent.spawn_later(duration, self.runner.quit)
        self.runner.greenlet.join()
    finally:
        self.on_finish(self)

This is the method that starts the environment.
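NewOnLiveStats is not shown in the issue. Purely as a guess at its shape, it might be a callable that forwards aggregated stats from the environment to the on_live_stats callback once per second:

```python
import gevent

class NewOnLiveStats:
    """Hypothetical reconstruction; the real class is not shown in the issue."""

    def __init__(self, on_live_stats):
        self.on_live_stats = on_live_stats

    def __call__(self, environment):
        # Forward aggregated stats to the callback once per second
        # until the runner stops.
        while environment.runner is None or environment.runner.state != "stopped":
            if self.on_live_stats is not None and environment.runner is not None:
                total = environment.stats.total
                self.on_live_stats({
                    "requests": total.num_requests,
                    "failures": total.num_failures,
                    "rps": total.current_rps,
                })
            gevent.sleep(1)
```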

Python version

3.12

Locust version

latest

Operating system

windows

@ichsansaid ichsansaid added the bug label Mar 21, 2024
@cyberw
Collaborator

cyberw commented Mar 22, 2024

Hi! It sounds like you are running Locust in the same process as the thing you are testing? Don't do that!

If necessary, use subprocess.Popen("locust …", …) in your application to launch Locust in a separate process. But ideally, a load testing tool should never run on the same machine as the thing you are testing.
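This suggestion can be sketched with the standard library alone. The flags below (--headless, -u, -r, -t, --csv, --csv-full-history) are real Locust CLI options; the locustfile path and CSV prefix are illustrative. Per-second stats can then be read from the generated stats-history CSV instead of sharing a process.

```python
import subprocess

def build_locust_cmd(locustfile="locustfile.py", users=10, spawn_rate=2, run_time="1m"):
    # Assemble a Locust command line; file names here are illustrative.
    return [
        "locust", "--headless",
        "-f", locustfile,
        "-u", str(users),          # peak user count
        "-r", str(spawn_rate),     # users spawned per second
        "-t", run_time,            # total run time
        "--csv", "results",        # writes results_stats.csv and friends
        "--csv-full-history",      # keep appending stats history rows
    ]

def launch_locust(**kwargs):
    # Run Locust out-of-process so its gevent loop cannot block FastAPI.
    return subprocess.Popen(build_locust_cmd(**kwargs))
```

Usage: `proc = launch_locust(users=50, run_time="5m")`, then poll `results_stats_history.csv` from the FastAPI side.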

@ichsansaid
Author

Hi! It sounds like you are running Locust in the same process as the thing you are testing? Don't do that!

If necessary, use subprocess.Popen("locust …", …) in your application to launch Locust in a separate process. But ideally, a load testing tool should never run on the same machine as the thing you are testing.

Hi cyberw, thanks for the reply. I would like to separate the process with Popen, but I can't, because I need to get the stats every second. If I run it in a different process, I can't get the stats. That's why I run Locust as a library.

@cyberw
Collaborator

cyberw commented Mar 22, 2024

Why can't you get the stats every second? Maybe you can do gevent.spawn(NewOnLiveStats(on_live_stats), self) in the test_start event handler? (idk what NewOnLiveStats is though :)

@cyberw cyberw added the stale Issue had no activity. Might still be worth fixing, but don't expect someone else to fix it label May 10, 2024

This issue was closed because it has been stalled for 10 days with no activity. This does not necessarily mean that the issue is bad, but it most likely means that nobody is willing to take the time to fix it. If you have found Locust useful, then consider contributing a fix yourself!
