How do you handle rate limit? #6

Open
Billy1900 opened this issue Mar 9, 2024 · 5 comments

@Billy1900

After running several times for a period, you must meet the rate limit (https://developers.binance.com/docs/derivatives/usds-margined-futures/general-info#limits), how do you handle that?

@nkaz001
Owner

nkaz001 commented Mar 10, 2024

I'm not sure where you're encountering API rate limits, since data collection is done through a websocket.
If it happens when connecting to the websocket, you need to implement backoff logic. If it happens when fetching snapshots, you might need to rely on natural refreshes (https://databento.com/blog/data-cleaning/) instead of filling the gap every time.
Ultimately, market makers are exempt from API rate limits; it's part of their edge as well.

By the way, I will update the collector to rely on natural refreshes and implement the change in Rust sooner or later.
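
For the websocket side, a minimal backoff sketch would look something like this, assuming an aiohttp client; the URL and message handler here are placeholders, not the collector's actual code:

```python
import asyncio
import aiohttp

async def stream_with_backoff(url, handle_message, max_backoff=60):
    """Reconnect to the websocket with exponential backoff on failures."""
    backoff = 1
    while True:
        try:
            async with aiohttp.ClientSession() as session:
                async with session.ws_connect(url) as ws:
                    backoff = 1  # reset the delay after a successful connection
                    async for msg in ws:
                        if msg.type == aiohttp.WSMsgType.TEXT:
                            await handle_message(msg.data)
                        elif msg.type in (aiohttp.WSMsgType.CLOSED,
                                          aiohttp.WSMsgType.ERROR):
                            break
        except aiohttp.ClientError:
            pass
        # wait before reconnecting, doubling the delay up to max_backoff
        await asyncio.sleep(backoff)
        backoff = min(backoff * 2, max_backoff)
```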

@Billy1900
Author

Billy1900 commented Mar 10, 2024

I'm trying to collect stream data for around 50 USDⓈ-M (UM) futures, so I think I'm sending too many requests from my IP. The traceback is as follows:

ERROR - Ratelimited on current request. Sleeping, then trying again. Try fewer Request: https://fapi.binance.com/fapi/v1/depth?symbol=agldusdt&limit=1000&timestamp=1709775440645 
 "symbol=agldusdt&limit=1000&timestamp=1709775440645"

WARNING - Canceling all known orders in the meantime.
ERROR - Sleeping for 5 seconds.
ERROR - Task exception was never retrieved
future: <Task finished name='Task-925' coro=<BinanceFutures.__get_marketdepth_snapshot() done, defined at Binance-Feed-Data-collector/collect/binancefutures.py:196> exception=TypeError("'str' object does not support item assignment")>
Traceback (most recent call last):
  File "/Binance-Feed-Data-collector/collect/binancefutures.py", line 108, in __curl
    logging.info("sending req to %s: %s" % (url, json.dumps(query or query or '')))
  File ".local/lib/python3.10/site-packages/aiohttp/client_reqrep.py", line 1060, in raise_for_status
    raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 429, message='Too Many Requests', url=URL('https://fapi.binance.com/fapi/v1/depth?symbol=agldusdt&limit=1000&timestamp=1709775440645')

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "Binance-Feed-Data-collector/collect/binancefutures.py", line 197, in __get_marketdepth_snapshot
    await self.client.close()
  File "Binance-Feed-Data-collector/collect/binancefutures.py", line 122, in __curl
    time.sleep(to_sleep)
  File "Binance-Feed-Data-collector/collect/binancefutures.py", line 86, in __curl
    query = {}
TypeError: 'str' object does not support item assignment
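
(As a side note, the secondary TypeError looks like the rate-limit retry path in __curl doing item assignment on a query that has already been encoded as a string. A minimal, hypothetical guard, assuming that's the cause:)

```python
# hypothetical guard for __curl: normalize query before any item assignment,
# so a retry with an already-encoded query string doesn't raise TypeError
from urllib.parse import parse_qsl

def normalize_query(query):
    if query is None:
        return {}
    if isinstance(query, str):
        # "symbol=agldusdt&limit=1000&timestamp=..." -> dict of params
        return dict(parse_qsl(query))
    return dict(query)
```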

@nkaz001
Owner

nkaz001 commented Mar 10, 2024

You need to split the assets across different IP addresses or spread out the timing of the snapshot fetches. Or you can rely on natural refreshes and get rid of snapshot fetching altogether.
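
For example, a rough sketch of spreading the snapshot requests out over time, assuming a single IP and a fixed delay between symbols; fetch_snapshot is a placeholder for the collector's own REST call:

```python
import asyncio

async def fetch_snapshots_staggered(symbols, fetch_snapshot, delay=2.0):
    # Fetch depth snapshots one symbol at a time, pausing between requests
    # so ~50 symbols don't consume the per-minute request weight all at once.
    for symbol in symbols:
        await fetch_snapshot(symbol)
        await asyncio.sleep(delay)
```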

@Billy1900
Author

Thanks. Could you explain natural refreshes in more technical detail? I didn't get much useful information from the link you gave.

@nkaz001
Owner

nkaz001 commented Mar 11, 2024

There was an article that explained the concept of 'natural refresh' in detail, but I can't find it now. Anyway, natural refresh occurs in a liquid asset because of the high turnover of market depth messages. Even if some messages are missing, leaving a gap, the lost depth information is quickly overwritten by new depth updates.

So you may not need to call the REST API to fetch a snapshot in order to fill the gap.
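
A simplified illustration, assuming Binance USDⓈ-M diff-depth events with 'pu'/'u' sequence fields and 'b'/'a' level lists; the REST snapshot is skipped entirely, and stale levels are simply overwritten by newer updates:

```python
class NaturalRefreshBook:
    """Maintain a book from diff-depth events only; on a sequence gap, keep
    applying updates and let turnover wash out the stale state instead of
    re-fetching a snapshot."""

    def __init__(self):
        self.bids = {}   # price -> qty
        self.asks = {}
        self.last_update_id = None

    def on_depth_event(self, event):
        # 'pu' is the previous event's final update id on USDS-M futures
        if self.last_update_id is not None and event['pu'] != self.last_update_id:
            # gap detected: in a liquid symbol the affected levels are
            # typically refreshed by new messages within a short time
            pass
        self.last_update_id = event['u']
        for price, qty in event['b']:
            if float(qty) == 0.0:
                self.bids.pop(price, None)
            else:
                self.bids[price] = qty
        for price, qty in event['a']:
            if float(qty) == 0.0:
                self.asks.pop(price, None)
            else:
                self.asks[price] = qty
```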
