Expose a batch rate limit request #168

Open · clarkds14 opened this issue Sep 2, 2020 · 3 comments

Comments
@clarkds14

There are situations where users would like to send a batch request to minimize network round trips. Two types of requests come to mind:

  1. Sending rate limit requests for more than one domain/key/value tuple.
  2. Sending rate limit requests for the same domain/key/value that are treated as if they were separate requests. There is no workaround for this with the current API, because summing the requests into one could put the entire batch over the limit (see the sketch below).
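
To illustrate the second case, here is a rough Go sketch against the current single-request API, using the go-control-plane v3 ratelimit stubs. The package paths, descriptor values, and server address are assumptions for illustration, not part of this proposal.

package main

import (
    "context"
    "fmt"
    "log"

    pb_struct "github.com/envoyproxy/go-control-plane/envoy/extensions/common/ratelimit/v3"
    pb "github.com/envoyproxy/go-control-plane/envoy/service/ratelimit/v3"
    "google.golang.org/grpc"
)

func main() {
    conn, err := grpc.Dial("localhost:8081", grpc.WithInsecure())
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close()
    client := pb.NewRateLimitServiceClient(conn)

    desc := []*pb_struct.RateLimitDescriptor{{
        Entries: []*pb_struct.RateLimitDescriptor_Entry{{Key: "user", Value: "alice"}},
    }}

    // Attempted workaround: sum three logical requests into one via hits_addend.
    // This is all-or-nothing: if only 2 of the 3 hits fit under the limit, the
    // whole request comes back OVER_LIMIT.
    summed, _ := client.ShouldRateLimit(context.Background(), &pb.RateLimitRequest{
        Domain:      "rl",
        Descriptors: desc,
        HitsAddend:  3,
    })
    fmt.Println("summed:", summed.GetOverallCode())

    // What a batch API would allow: three independent decisions, which today
    // cost three separate network round trips.
    for i := 0; i < 3; i++ {
        resp, _ := client.ShouldRateLimit(context.Background(), &pb.RateLimitRequest{
            Domain:      "rl",
            Descriptors: desc,
            HitsAddend:  1,
        })
        fmt.Printf("separate %d: %v\n", i, resp.GetOverallCode())
    }
}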
@mattklein123
Member

Can you propose an API change for discussion?

@clarkds14
Author

clarkds14 commented Sep 3, 2020

The gist would be something like:

service RateLimitService {
  // Existing single-request RPC, unchanged.
  rpc ShouldRateLimit(RateLimitRequest) returns (RateLimitResponse) {}

  // Proposed batch RPC.
  rpc BatchShouldRateLimit(BatchRateLimitRequest) returns (BatchRateLimitResponse) {}
}

message BatchRateLimitRequest {
  repeated RateLimitRequest requests = 1;
}

message BatchRateLimitResponse {
  repeated RateLimitResponse responses = 1;
}

The order of the responses matches the order of the requests. Some fields could probably be pulled out of the inner message onto the top-level message and coalesced (e.g. response_headers_to_add), but otherwise this would achieve the goal.
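
For concreteness, a hypothetical client-side helper could pair each response with its request by index. This reuses the pb alias from the sketch above; BatchShouldRateLimit and the batch messages would only exist once this proto change lands and the Go stubs are regenerated.

// checkBatch issues one batched call and returns the per-request decisions
// in the same order as reqs. Hypothetical: relies on stubs generated from
// the proposed messages above.
func checkBatch(ctx context.Context, client pb.RateLimitServiceClient,
    reqs []*pb.RateLimitRequest) ([]pb.RateLimitResponse_Code, error) {

    resp, err := client.BatchShouldRateLimit(ctx, &pb.BatchRateLimitRequest{Requests: reqs})
    if err != nil {
        return nil, err
    }
    // Responses are positional: resp.Responses[i] answers reqs[i].
    codes := make([]pb.RateLimitResponse_Code, len(resp.Responses))
    for i, r := range resp.Responses {
        codes[i] = r.GetOverallCode()
    }
    return codes, nil
}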

@mattklein123
Member

Sure SGTM. Marking help wanted.
