
Service-Level Concurrency Control #217

Closed
pavelnemirovsky opened this issue Feb 18, 2024 · 1 comment
@pavelnemirovsky

Is your feature request related to a problem? Please describe.
Imagine I have a dynamic pool of containers that fluctuate in number. I need to guarantee that tasks from the queue are executed with consistent concurrency across the entire service, rather than on a per-application basis.

Describe the solution you'd like
I would like Rqueue to monitor the total number of tasks executed on a per-queue basis and ensure that the number of tasks executed concurrently aligns with the predefined limit for each queue (at the service level).

Describe alternatives you've considered
Create a specialized concurrency controller to oversee the total number of tasks executed across the service. For example, an implementation that utilizes Redis INCR/DECR commands.
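The INCR/DECR alternative described above can be sketched as a small gate class. This is a minimal sketch, not Rqueue code: the `ConcurrencyGate` name is invented here, and an `AtomicInteger` stands in for the shared Redis counter so the example is self-contained (with Redis, `tryAcquire` would be an INCR followed by a rollback DECR on the shared key, making the cap hold across all containers).

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the INCR/DECR idea: admit a task only while the running count
// for the queue is below the service-level limit. The AtomicInteger is an
// in-memory stand-in for a per-queue Redis counter.
public class ConcurrencyGate {
    private final int limit;
    private final AtomicInteger running = new AtomicInteger();

    public ConcurrencyGate(int limit) {
        this.limit = limit;
    }

    // INCR, then roll back with DECR if the limit was exceeded.
    public boolean tryAcquire() {
        if (running.incrementAndGet() <= limit) {
            return true;
        }
        running.decrementAndGet();
        return false;
    }

    // DECR when the task finishes, whether it succeeded or failed.
    public void release() {
        running.decrementAndGet();
    }

    public static void main(String[] args) {
        ConcurrencyGate gate = new ConcurrencyGate(2);
        System.out.println(gate.tryAcquire()); // admitted
        System.out.println(gate.tryAcquire()); // admitted
        System.out.println(gate.tryAcquire()); // rejected, limit reached
        gate.release();
        System.out.println(gate.tryAcquire()); // admitted again
    }
}
```

A rejected task would typically be re-enqueued or delayed rather than dropped; that policy is outside this sketch.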

Additional context
I apologize for opening a feature request; perhaps I didn't understand how to implement it based on the provided example. Thank you for your assistance.

@sonus21
Owner

sonus21 commented Feb 18, 2024

Hi @pavelnemirovsky
It's very simple: you can apply a rate limiter at the queue level using middleware.
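The middleware approach suggested here can be sketched as follows. Note the hedges: the `Job` and `Middleware` interfaces below are simplified stand-ins defined locally for illustration, not Rqueue's actual types (consult the Rqueue middleware documentation for those), and a `Semaphore` only caps concurrency within one JVM — a service-wide cap across a fluctuating pool of containers would need a shared counter such as the Redis INCR/DECR scheme from the issue.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.Semaphore;

// Hypothetical stand-ins for a queue library's job and middleware types.
interface Job {
    String getQueueName();
}

interface Middleware {
    void handle(Job job, Callable<Void> next) throws Exception;
}

// Middleware that caps how many tasks from a queue run at once in this JVM.
class ConcurrencyLimitMiddleware implements Middleware {
    private final Semaphore permits;

    ConcurrencyLimitMiddleware(int maxConcurrent) {
        this.permits = new Semaphore(maxConcurrent);
    }

    @Override
    public void handle(Job job, Callable<Void> next) throws Exception {
        permits.acquire();      // block until a slot is free
        try {
            next.call();        // run the actual task handler
        } finally {
            permits.release();  // free the slot even if the handler threw
        }
    }
}
```

Wiring such a middleware into the consumer chain is library-specific and not shown here.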

@sonus21 sonus21 closed this as completed May 14, 2024