Congestion Avoidance #306
Comments
The problem is on line 96 of the load balancer. It accepts all incoming requests irrespective of the existing load. We can add a restriction so that a request is accepted only when it satisfies the following conditions.
Hint: We need to show a load message to the user. We can also apply the following built-in methods of the ES6 Array object.
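As a rough sketch (assuming the pending requests are held in a plain array named `queue`; the variable names and the threshold of 20 are illustrative, not the project's actual code), the check could use ES6 Array methods such as `some`:

```javascript
// Hypothetical sketch: `queue` and MAX_PENDING are assumed names,
// not taken from the actual load balancer source.
const MAX_PENDING = 20;
const queue = [];

function tryAccept(request) {
  // Reject a duplicate: the same id is already pending (Array.prototype.some).
  if (queue.some((pending) => pending.id === request.id)) {
    return { accepted: false, reason: 'already pending' };
  }
  // Reject excess load: too many evaluations are already queued.
  if (queue.length >= MAX_PENDING) {
    return { accepted: false, reason: 'server overloaded' };
  }
  queue.push(request);
  return { accepted: true };
}
```

On rejection, the caller would send the user the load message instead of queuing the evaluation.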
We need to refactor lines 145-174 into a separate module named `congestion_avoid` and import it as a function to be used near line 96.
@AnkshitJain This seems simpler than having a separate microservice for congestion management. What do you think?
The above approach is good if there are no evaluation failures in execution nodes. There are going to be cases where an execution node fails with an exception due to a corner case and abandons the evaluation. In such cases, the evaluation remains pending in the load balancer forever. We had a similar problem in the past when users submitted an evaluation from the webpage, refreshed the page, and submitted a fresh request. The problem also arises when a user submits evaluation requests from multiple browser tabs. A better solution is to use node-cache, which offers configurable timeouts for pending entries in a map. The salient features of the module are:
The module has nice callbacks that get called on specific events happening on the cache. So a simple check would be as follows:

```js
if (myCache.get(id) !== undefined) {
  // send a JSON indicating a pending evaluation
} else if (myCache.getStats().keys > 20) {
  // send a JSON indicating the server is overloaded
} else {
  // evaluate
  myCache.set(req.body.id, req.body);
}
```

The requirement is that we change the queue from an array to a cache. A good way to achieve this change is to import the `node-cache` module.
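To illustrate the timeout behaviour node-cache gives us, here is a minimal sketch of a map with per-entry expiry (an illustration only; the real node-cache module should be preferred, as it also provides the event callbacks and statistics mentioned above):

```javascript
// Minimal sketch of a map with per-entry timeouts, mimicking the idea
// behind node-cache's TTL support. Not a replacement for node-cache.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.entries = new Map(); // id -> { value, expiresAt }
  }
  set(id, value) {
    this.entries.set(id, { value, expiresAt: Date.now() + this.ttlMs });
  }
  get(id) {
    const entry = this.entries.get(id);
    if (entry === undefined) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Entry timed out: the pending evaluation is considered abandoned,
      // so a fresh submission with the same id is accepted again.
      this.entries.delete(id);
      return undefined;
    }
    return entry.value;
  }
}
```

With a timeout like this, an evaluation abandoned by a crashed execution node stops blocking the user's resubmission once its entry expires.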
Check `node-rate-limiter`, `express-rate-limit`, or `bottleneck`.
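For a rough idea of what these libraries do, here is a sketch of a fixed-window rate limiter per client (names and limits are illustrative; the real packages handle concurrency, stores, and edge cases this ignores):

```javascript
// Hypothetical fixed-window rate limiter, sketching the core idea behind
// express-rate-limit and similar modules.
function makeRateLimiter(windowMs, maxRequests) {
  const windows = new Map(); // clientId -> { windowStart, count }
  return function allow(clientId, now = Date.now()) {
    const state = windows.get(clientId);
    if (state === undefined || now - state.windowStart >= windowMs) {
      // Start a new window for this client.
      windows.set(clientId, { windowStart: now, count: 1 });
      return true;
    }
    state.count += 1;
    return state.count <= maxRequests;
  };
}
```

In Express, `express-rate-limit` packages this pattern as middleware, so excess requests are rejected before they ever reach the evaluation queue.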
We can use `request-ip` to consider the client IP from which the evaluation requests are received.
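For illustration, this is the kind of lookup `request-ip` performs (a simplified sketch; the real module checks many more headers and proxy configurations, so it should be used instead of hand-rolled logic):

```javascript
// Simplified sketch of client-IP detection, in the spirit of request-ip.
function getClientIp(req) {
  const forwarded = req.headers && req.headers['x-forwarded-for'];
  if (forwarded) {
    // X-Forwarded-For may hold a comma-separated proxy chain;
    // the first entry is the original client.
    return forwarded.split(',')[0].trim();
  }
  return req.connection && req.connection.remoteAddress;
}
```

The resolved IP can then be used as the key for the rate limiter, so one client flooding submissions does not block everyone else.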
Description
The number of pending requests grows sharply towards the assignment submission deadline. This is due to two reasons.
Steps to Reproduce
Expected behavior:
System quickly and transparently rejects excess load.
Actual behavior:
System slows down significantly or crashes.
Reproduces how often:
Every time near the submission deadline
Additional Information
Silently failing the requests is a quick fix, but not a good solution.