
User-implemented rate limiting #1051

Open
Yahweasel opened this issue Oct 19, 2020 · 3 comments

Comments

@Yahweasel

It would be nice if the client allowed the user to inject functions to handle rate limiting. Say, an async function waitForRateLimit that takes the route as an argument, and a function updateRateLimit that takes the headers from the response as an argument. The REST code would await waitForRateLimit before the actual call, then call updateRateLimit when it's done. This could sit on top of the existing rate limiting, so a naïve user couldn't simply bypass rate limiting and blow themselves up.
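Roughly, the shape I have in mind. This is a sketch, not Eris's actual API; the in-memory Map here stands in for whatever shared store the user wires up (in my case, a database shared between processes). The header names are Discord's real x-ratelimit-* headers; everything else is hypothetical:

```javascript
// Hypothetical user-injected rate-limit hooks. The Map is a stand-in
// for a store shared across all client processes.
const buckets = new Map(); // route -> { remaining, resetAt }

// The REST layer would await this before making the actual request.
async function waitForRateLimit(route) {
  const bucket = buckets.get(route);
  if (!bucket || bucket.remaining > 0) return;
  // Out of requests on this route: sleep until the bucket resets.
  const delay = Math.max(0, bucket.resetAt - Date.now());
  await new Promise((resolve) => setTimeout(resolve, delay));
}

// The REST layer would call this with the response headers afterwards.
function updateRateLimit(route, headers) {
  buckets.set(route, {
    remaining: Number(headers['x-ratelimit-remaining']),
    // x-ratelimit-reset is in epoch seconds; store milliseconds.
    resetAt: Number(headers['x-ratelimit-reset']) * 1000,
  });
}
```

A user in my situation would replace the Map reads/writes with queries against the shared database; everyone else would never supply these hooks and get the current built-in behavior.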

To explain why I would want this, my bot has recently been getting rate limited to Hell. Total IP-block from Discord's API services for hours at a time. Eris's rate limiting is correctly implemented (except for the absence of buckets...), but entirely managed within each Eris client. I have 8 bots times 128 shards (process per shard) equals 1024 Eris clients running simultaneously. Even if each independently respects the rate limiting header, if more than a few of those make requests too quickly, it snowballs and they all get blocked.

My solution was to fork Eris and rewrite the rate limiting code to use a shared database (as well as the route→bucket mapping). This appears to be working, but is obviously a nonstarter for upstreaming into Eris proper; my use of Eris is extremely unusual, and the common case shouldn't be made this complex. Most people don't want to pull in sqlite3. But, I'd also prefer not to maintain a fork forever just for this one little corner, nor to monkeypatch Eris's core infrastructure from the client code.

I'm hoping that something like this, having user-injected functions to handle rate limiting, might be a workable compromise. Any users in unusual cases like mine would need to handle the sharing of rate limiting data on their own, but at least they'd have the ability to do so without forking Eris.

Alternatively, perhaps a database solution like mine would be acceptable if it were a user-configurable alternative, so most users didn't need the overhead or dependency. In that case, I'd be happy to write it and make a pull request.

@bsian03
Collaborator

bsian03 commented Oct 19, 2020

It sounds like you'd be better off having a separate process serving REST requests (essentially a middleman), so that you won't get rate limited by all the clients failing to coordinate with each other

@Yahweasel
Author

Possibly, but that is also not something Eris can do, so it's not really relevant.

@bsian03
Collaborator

bsian03 commented Oct 19, 2020

Though that could also work the same way, by allowing the user to supply their own address/configuration for the middleman process and having it send back what Discord returns. But that might be out of scope for what Eris was intended for
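The core of such a middleman would be a single queue that serializes every client's REST calls, so one rate limiter covers them all. A minimal sketch of just that serializing core (hypothetical, not anything Eris provides; the actual HTTP forwarding to Discord is stubbed out as a caller-supplied forward function):

```javascript
// Hypothetical middleman core: every client process enqueues its REST
// requests here, and they are forwarded to Discord one at a time, so a
// single rate limiter sees all traffic.
const queue = [];
let draining = false;

// Clients call this instead of hitting Discord directly.
// `forward` is the caller-supplied function that actually performs the
// HTTP request (stubbed here to keep the sketch self-contained).
function enqueue(request, forward) {
  return new Promise((resolve) => {
    queue.push({ request, forward, resolve });
    drain();
  });
}

async function drain() {
  if (draining) return; // already serializing
  draining = true;
  while (queue.length) {
    const { request, forward, resolve } = queue.shift();
    // One in-flight request at a time; rate-limit checks would go here.
    resolve(await forward(request));
  }
  draining = false;
}
```

In a real deployment the clients would reach this process over a socket or local HTTP, and drain() would consult the rate-limit buckets before each forward.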
