
No performance metrics are recorded. Upstream responds with 502 #2931

Open

rmilesson opened this issue Apr 3, 2024 · 16 comments

@rmilesson

Self-Hosted Version

24.3.0

CPU Architecture

x86_64

Docker Version

24.0.6

Docker Compose Version

2.21.0

Steps to Reproduce

  1. Set up performance monitoring by following https://docs.sentry.io/platforms/php/performance/
  2. Visit the site

Expected Result

Performance metrics should show up for the configured project.

Actual Result

No performance metrics are showing up for the affected projects. Some other projects we have are successfully reporting performance metrics.

For one of the affected projects I'm seeing this in the logs:

sentry-self-hosted-nginx-1                                         | 2024/04/03 06:26:50 [error] 29#29: *69800 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [03/Apr/2024:06:26:50 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/03 06:26:53 [error] 29#29: *69802 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [03/Apr/2024:06:26:53 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [03/Apr/2024:06:26:54 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/03 06:26:54 [error] 29#29: *69804 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 2024/04/03 06:26:58 [error] 29#29: *69806 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [03/Apr/2024:06:26:58 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"

Event ID

No response

@azaslavsky
Contributor

Do other features (like error reporting) work? Can you access the sentry UI via the web frontend?

@rmilesson
Author

@azaslavsky Yes. Everything else seems to work correctly and I can access the web frontend without issues.

We have two backend projects where I've implemented performance metrics. Only one of them reports any performance metrics, even though they're implemented exactly the same.

@rmilesson
Author

Also, looking at the internal project in Sentry, it reports no issues.

@rmilesson
Author

@azaslavsky You know what, I just noticed that we stopped receiving anything three days ago, probably when I upgraded Sentry to 24.3.0.

@rmilesson
Author

Here is a larger log output:

sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:07 +0000] "GET /api/0/organizations/tankbar/searches/ HTTP/1.0" 200 222 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:07 +0000] "GET /api/0/organizations/tankbar/processingissues/?project=88 HTTP/1.0" 200 151 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:07 +0000] "GET /api/0/organizations/tankbar/users/?project=88 HTTP/1.0" 200 9783 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:07 +0000] "GET /api/0/organizations/tankbar/tags/?project=88&statsPeriod=14d&use_cache=1 HTTP/1.0" 200 1115 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:08 +0000] "GET /api/0/organizations/tankbar/issues/?collapse=stats&collapse=unhandled&expand=owners&expand=inbox&limit=25&project=88&query=is%3Aunresolved&savedSearch=1&shortIdLookup=1&statsPeriod=14d HTTP/1.0" 200 3238 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:08 +0000] "GET /api/0/organizations/tankbar/issues-count/?project=88&query=is%3Aunresolved%20is%3Afor_review%20assigned_or_suggested%3A%5Bme%2C%20my_teams%2C%20none%5D&query=is%3Aregressed&query=is%3Aescalating&query=is%3Aarchived&statsPeriod=14d HTTP/1.0" 200 136 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:08 +0000] "GET /api/0/organizations/tankbar/issues-stats/?groups=69791&groups=78218&groups=76410&project=88&query=is%3Aunresolved&statsPeriod=14d HTTP/1.0" 200 1901 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:11 [error] 26#26: *188032 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/1/envelope/?sentry_key=2997fe123085485fb23feef1ff7133fa&sentry_version=7&sentry_client=sentry.javascript.react%2F7.105.0 HTTP/1.0", upstream: "http://172.30.0.51:3000/api/1/envelope/?sentry_key=2997fe123085485fb23feef1ff7133fa&sentry_version=7&sentry_client=sentry.javascript.react%2F7.105.0", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:11 +0000] "POST /api/1/envelope/?sentry_key=2997fe123085485fb23feef1ff7133fa&sentry_version=7&sentry_client=sentry.javascript.react%2F7.105.0 HTTP/1.0" 502 552 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:14 [error] 26#26: *188034 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:14 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.101"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:24 [error] 26#26: *188036 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7 HTTP/1.0", upstream: "http://172.30.0.51:3000/api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:24 +0000] "POST /api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7 HTTP/1.0" 502 552 "-" "Mozilla/5.0 (X11; CrOS armv7l 13597.84.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.187 Safari/537.36" "85.225.106.94"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:35 [error] 26#26: *188038 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/21/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/21/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:35 +0000] "POST /api/21/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.21.0" "195.67.74.142"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:42 [error] 26#26: *188040 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/42/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/42/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:42 +0000] "POST /api/42/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/3.22.1" "195.67.74.134"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:55:53 [error] 26#26: *188042 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/17/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/17/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:55:53 +0000] "POST /api/17/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/3.21.0" "185.91.65.41"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:07 [error] 26#26: *188044 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/51/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/51/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:07 +0000] "POST /api/51/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.7.0" "195.67.74.133"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:07 [error] 26#26: *188046 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/51/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/51/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:07 +0000] "POST /api/51/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.7.0" "195.67.74.133"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:07 [error] 26#26: *188048 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/51/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/51/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:07 +0000] "POST /api/51/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.7.0" "195.67.74.133"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:07 [error] 26#26: *188050 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/51/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/51/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:07 +0000] "POST /api/51/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.7.0" "195.67.74.133"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:09 [error] 26#26: *188052 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/21/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/21/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:09 +0000] "POST /api/21/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.21.0" "195.67.74.142"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:22 [error] 26#26: *188054 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/54/store/?sentry_key=c7052809688b40e683ade3912aa8f8d6&sentry_version=7 HTTP/1.0", upstream: "http://172.30.0.51:3000/api/54/store/?sentry_key=c7052809688b40e683ade3912aa8f8d6&sentry_version=7", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:22 +0000] "POST /api/54/store/?sentry_key=c7052809688b40e683ade3912aa8f8d6&sentry_version=7 HTTP/1.0" 502 150 "-" "Mozilla/5.0 (iPhone; CPU iPhone OS 17_2_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/21C66 [FBAN/FBIOS;FBAV/443.0.2.38.229;FBBV/543984574;FBDV/iPhone12,1;FBMD/iPhone;FBSN/iOS;FBSV/17.2.1;FBSS/2;FBID/phone;FBLC/sv_SE;FBOP/5;FBRV/546755414]" "78.69.14.191"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:27 [error] 26#26: *188056 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7 HTTP/1.0", upstream: "http://172.30.0.51:3000/api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:27 +0000] "POST /api/11/store/?sentry_key=35f6d09d4cfc46df95459386c4080eed&sentry_version=7 HTTP/1.0" 502 552 "-" "Mozilla/5.0 (X11; CrOS armv7l 13597.84.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.187 Safari/537.36" "85.225.106.94"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:37 [error] 26#26: *188058 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:37 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:44 [error] 26#26: *188060 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/28/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/28/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:44 +0000] "POST /api/28/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/4.6.1" "192.168.3.172"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:56:48 [error] 26#26: *188062 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/21/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/21/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:56:48 +0000] "POST /api/21/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.21.0" "195.67.74.142"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:57:05 [error] 29#29: *188064 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/42/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/42/envelope/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:57:05 +0000] "POST /api/42/envelope/ HTTP/1.0" 502 150 "-" "sentry.php/3.22.1" "195.67.74.134"
sentry-self-hosted-cron-1                                          | 13:57:06 [WARNING] sentry_sdk.errors: Intervals shorter than one minute are not supported by Sentry Crons. Monitor 'deliver-webhooks-control' has an interval of 10 seconds. Use the `exclude_beat_tasks` option in the celery integration to exclude it.
sentry-self-hosted-web-1                                           | 13:57:06 [INFO] sentry.access.api: api.access (method='POST' view='sentry.api.endpoints.relay.project_configs.RelayProjectConfigsEndpoint' response=200 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/0/relays/projectconfigs/' caller_ip='172.30.0.20' user_agent='None' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.016324520111083984 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
sentry-self-hosted-cron-1                                          | 13:57:16 [WARNING] sentry_sdk.errors: Intervals shorter than one minute are not supported by Sentry Crons. Monitor 'deliver-webhooks-control' has an interval of 10 seconds. Use the `exclude_beat_tasks` option in the celery integration to exclude it.
sentry-self-hosted-web-1                                           | 13:57:16 [INFO] sentry.access.api: api.access (method='POST' view='sentry.api.endpoints.relay.project_configs.RelayProjectConfigsEndpoint' response=200 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/0/relays/projectconfigs/' caller_ip='172.30.0.20' user_agent='None' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.015629053115844727 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:57:23 [error] 29#29: *188066 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/73/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/73/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:57:23 +0000] "POST /api/73/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.4.0" "185.91.65.48"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:57:23 [error] 29#29: *188068 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/73/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/73/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:57:23 +0000] "POST /api/73/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.4.0" "185.91.65.48"
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:57:24 [error] 29#29: *188070 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/73/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/73/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:57:24 +0000] "POST /api/73/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.4.0" "185.91.65.48"
sentry-self-hosted-cron-1                                          | 13:57:26 [WARNING] sentry_sdk.errors: Intervals shorter than one minute are not supported by Sentry Crons. Monitor 'deliver-webhooks-control' has an interval of 10 seconds. Use the `exclude_beat_tasks` option in the celery integration to exclude it.
sentry-self-hosted-web-1                                           | 13:57:26 [INFO] sentry.access.api: api.access (method='POST' view='sentry.api.endpoints.relay.project_configs.RelayProjectConfigsEndpoint' response=200 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/0/relays/projectconfigs/' caller_ip='172.30.0.20' user_agent='None' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.016953468322753906 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
sentry-self-hosted-nginx-1                                         | 2024/04/05 13:57:26 [error] 29#29: *188072 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/73/store/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/73/store/", host: "sentry.tankbar.com"
sentry-self-hosted-nginx-1                                         | 172.30.0.1 - - [05/Apr/2024:13:57:26 +0000] "POST /api/73/store/ HTTP/1.0" 502 150 "-" "sentry.php/3.4.0" "185.91.65.48"

@rmilesson
Author

rmilesson commented Apr 5, 2024

I'm also seeing what I believe to be some intermittent Kafka errors when I run docker compose logs:

sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] arroyo.processing.processor: New partitions assigned: {Partition(topic=Topic(name='events'), index=0): 26518427}
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] sentry.post_process_forwarder.post_process_forwarder: Starting multithreaded post process forwarder
sentry-self-hosted-post-process-forwarder-errors-1                 | Traceback (most recent call last):
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/processing/processor.py", line 322, in run
sentry-self-hosted-post-process-forwarder-errors-1                 |     self._run_once()
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/processing/processor.py", line 384, in _run_once
sentry-self-hosted-post-process-forwarder-errors-1                 |     self.__message = self.__consumer.poll(timeout=1.0)
sentry-self-hosted-post-process-forwarder-errors-1                 |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/consumers/synchronized.py", line 235, in poll
sentry-self-hosted-post-process-forwarder-errors-1                 |     message = self.__consumer.poll(timeout)
sentry-self-hosted-post-process-forwarder-errors-1                 |               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/backends/kafka/consumer.py", line 414, in poll
sentry-self-hosted-post-process-forwarder-errors-1                 |     raise OffsetOutOfRange(str(error))
sentry-self-hosted-post-process-forwarder-errors-1                 | arroyo.errors.OffsetOutOfRange: KafkaError{code=_AUTO_OFFSET_RESET,val=-140,str="fetch failed due to requested offset not available on the broker: Broker: Offset out of range (broker 1001)"}
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [ERROR] arroyo.processing.processor: Caught exception, shutting down...
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] arroyo.processing.processor: Closing <sentry.consumers.synchronized.SynchronizedConsumer object at 0x7f75cb002650>...
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] arroyo.processing.processor: Partitions to revoke: [Partition(topic=Topic(name='events'), index=0)]
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] arroyo.processing.processor: Partition revocation complete.
sentry-self-hosted-post-process-forwarder-errors-1                 | 14:13:43 [INFO] arroyo.processing.processor: Processor terminated
sentry-self-hosted-post-process-forwarder-errors-1                 | Traceback (most recent call last):
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/bin/sentry", line 8, in <module>
sentry-self-hosted-post-process-forwarder-errors-1                 |     sys.exit(main())
sentry-self-hosted-post-process-forwarder-errors-1                 |              ^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/runner/__init__.py", line 190, in main
sentry-self-hosted-post-process-forwarder-errors-1                 |     func(**kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
sentry-self-hosted-post-process-forwarder-errors-1                 |     return self.main(*args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
sentry-self-hosted-post-process-forwarder-errors-1                 |     rv = self.invoke(ctx)
sentry-self-hosted-post-process-forwarder-errors-1                 |          ^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return _process_result(sub_ctx.command.invoke(sub_ctx))
sentry-self-hosted-post-process-forwarder-errors-1                 |                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return _process_result(sub_ctx.command.invoke(sub_ctx))
sentry-self-hosted-post-process-forwarder-errors-1                 |                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return ctx.invoke(self.callback, **ctx.params)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return __callback(*args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/decorators.py", line 33, in new_func
sentry-self-hosted-post-process-forwarder-errors-1                 |     return f(get_current_context(), *args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/runner/decorators.py", line 69, in inner
sentry-self-hosted-post-process-forwarder-errors-1                 |     return ctx.invoke(f, *args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return __callback(*args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/decorators.py", line 33, in new_func
sentry-self-hosted-post-process-forwarder-errors-1                 |     return f(get_current_context(), *args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/runner/decorators.py", line 29, in inner
sentry-self-hosted-post-process-forwarder-errors-1                 |     return ctx.invoke(f, *args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
sentry-self-hosted-post-process-forwarder-errors-1                 |     return __callback(*args, **kwargs)
sentry-self-hosted-post-process-forwarder-errors-1                 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/runner/commands/run.py", line 448, in basic_consumer
sentry-self-hosted-post-process-forwarder-errors-1                 |     run_processor_with_signals(processor, consumer_name)
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/utils/kafka.py", line 46, in run_processor_with_signals
sentry-self-hosted-post-process-forwarder-errors-1                 |     processor.run()
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/processing/processor.py", line 322, in run
sentry-self-hosted-post-process-forwarder-errors-1                 |     self._run_once()
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/processing/processor.py", line 384, in _run_once
sentry-self-hosted-post-process-forwarder-errors-1                 |     self.__message = self.__consumer.poll(timeout=1.0)
sentry-self-hosted-post-process-forwarder-errors-1                 |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/sentry/consumers/synchronized.py", line 235, in poll
sentry-self-hosted-post-process-forwarder-errors-1                 |     message = self.__consumer.poll(timeout)
sentry-self-hosted-post-process-forwarder-errors-1                 |               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry-self-hosted-post-process-forwarder-errors-1                 |   File "/usr/local/lib/python3.11/site-packages/arroyo/backends/kafka/consumer.py", line 414, in poll
sentry-self-hosted-post-process-forwarder-errors-1                 |     raise OffsetOutOfRange(str(error))
sentry-self-hosted-post-process-forwarder-errors-1                 | arroyo.errors.OffsetOutOfRange: KafkaError{code=_AUTO_OFFSET_RESET,val=-140,str="fetch failed due to requested offset not available on the broker: Broker: Offset out of range (broker 1001)"}
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,200] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,273] INFO [SocketServer brokerId=1001] Created data-plane acceptor and processors for endpoint : EndPoint(0.0.0.0,9092,ListenerName(PLAINTEXT),PLAINTEXT) (kafka.network.SocketServer)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,277] INFO [SocketServer brokerId=1001] Started 1 acceptor threads for data-plane (kafka.network.SocketServer)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,524] INFO Creating /brokers/ids/1001 (is it secure? false) (kafka.zk.KafkaZkClient)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,587] INFO Stat of the created znode at /brokers/ids/1001 is: 5263,5263,1712326257561,1712326257561,1,0,0,73072274353029121,180,0,5263
sentry-self-hosted-kafka-1                                         |  (kafka.zk.KafkaZkClient)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:57,588] INFO Registered broker 1001 at path /brokers/ids/1001 with addresses: ArrayBuffer(EndPoint(kafka,9092,ListenerName(PLAINTEXT),PLAINTEXT)), czxid (broker epoch): 5263 (kafka.zk.KafkaZkClient)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:58,352] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,336] INFO [SocketServer brokerId=1001] Started data-plane processors for 1 acceptors (kafka.network.SocketServer)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,746] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,758] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,758] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,761] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,761] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:10:59,758] ERROR [KafkaApi-1001] Number of alive brokers '0' does not meet the required replication factor '1' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,643] INFO Cleaner 0: Beginning cleaning of log __consumer_offsets-18. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,644] INFO Cleaner 0: Building offset map for __consumer_offsets-18... (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,669] INFO Cleaner 0: Building offset map for log __consumer_offsets-18 for 4 segments in offset range [29538, 35331). (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,724] INFO Cleaner 0: Offset map for log __consumer_offsets-18 complete. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,725] INFO Cleaner 0: Cleaning log __consumer_offsets-18 (cleaning prior to Fri Sep 29 14:00:08 UTC 2023, discarding tombstones prior to Wed Aug 23 09:29:39 UTC 2023)... (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,733] INFO Cleaner 0: Cleaning LogSegment(baseOffset=0, size=1173625, lastModifiedTime=1692869379000, largestTime=1692869379647) in log __consumer_offsets-18 into 0 with deletion horizon 1692782979000, retaining deletes. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,777] INFO Cleaner 0: Cleaning LogSegment(baseOffset=29538, size=361008, lastModifiedTime=1693316980000, largestTime=1693316980035) in log __consumer_offsets-18 into 0 with deletion horizon 1692782979000, retaining deletes. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,783] INFO Cleaner 0: Cleaning LogSegment(baseOffset=31194, size=241980, lastModifiedTime=1694694007000, largestTime=1694694007140) in log __consumer_offsets-18 into 0 with deletion horizon 1692782979000, retaining deletes. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,787] INFO Cleaner 0: Cleaning LogSegment(baseOffset=32304, size=321768, lastModifiedTime=1695645607000, largestTime=1695645607753) in log __consumer_offsets-18 into 0 with deletion horizon 1692782979000, retaining deletes. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,790] INFO Cleaner 0: Cleaning LogSegment(baseOffset=33780, size=338118, lastModifiedTime=1695996008000, largestTime=1695996008081) in log __consumer_offsets-18 into 0 with deletion horizon 1692782979000, retaining deletes. (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,818] INFO Cleaner 0: Swapping in cleaned segment LogSegment(baseOffset=0, size=1403414, lastModifiedTime=1695996008000, largestTime=1695996008081) for segment(s) List(LogSegment(baseOffset=0, size=1173625, lastModifiedTime=1692869379000, largestTime=1692869379647), LogSegment(baseOffset=29538, size=361008, lastModifiedTime=1693316980000, largestTime=1693316980035), LogSegment(baseOffset=31194, size=241980, lastModifiedTime=1694694007000, largestTime=1694694007140), LogSegment(baseOffset=32304, size=321768, lastModifiedTime=1695645607000, largestTime=1695645607753), LogSegment(baseOffset=33780, size=338118, lastModifiedTime=1695996008000, largestTime=1695996008081)) in log Log(dir=/var/lib/kafka/data/__consumer_offsets-18, topic=__consumer_offsets, partition=18, highWatermark=35332, lastStableOffset=35332, logStartOffset=0, logEndOffset=35332) (kafka.log.LogCleaner)
sentry-self-hosted-kafka-1                                         | [2024-04-05 14:13:56,826] INFO [kafka-log-cleaner-thread-0]: 
sentry-self-hosted-kafka-1                                         | 	Log cleaner thread 0 cleaned log __consumer_offsets-18 (dirty section = [29538, 29538])
sentry-self-hosted-kafka-1                                         | 	2.3 MB of log processed in 0.2 seconds (12.8 MB/sec).
sentry-self-hosted-kafka-1                                         | 	Indexed 1.2 MB in 0.1 seconds (14.9 Mb/sec, 44.8% of total time)
sentry-self-hosted-kafka-1                                         | 	Buffer utilization: 0.0%
sentry-self-hosted-kafka-1                                         | 	Cleaned 2.3 MB in 0.1 seconds (23.2 Mb/sec, 55.2% of total time)
sentry-self-hosted-kafka-1                                         | 	Start size: 2.3 MB (15,639 messages)
sentry-self-hosted-kafka-1                                         | 	End size: 1.3 MB (11,777 messages)
sentry-self-hosted-kafka-1                                         | 	42.4% size reduction (24.7% fewer messages)
sentry-self-hosted-kafka-1                                         |  (kafka.log.LogCleaner)
sentry-self-hosted-ingest-monitors-1                               | 13:42:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:42:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:44:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:44:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:45:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:45:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:47:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:47:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:48:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:48:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:49:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:49:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:50:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:50:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:51:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:51:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:52:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:52:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:54:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:54:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:55:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:55:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:56:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:56:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:57:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:57:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:58:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:58:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 13:59:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 13:59:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:00:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:00:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:01:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:01:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:02:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:02:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:03:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:03:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:04:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:04:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:05:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:05:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:06:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:06:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:07:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:07:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:08:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:08:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:09:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:09:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Shutdown signalled
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Stopping consumer
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Partitions to revoke: [Partition(topic=Topic(name='ingest-monitors'), index=0)]
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Closing <arroyo.processing.strategies.guard.StrategyGuard object at 0x7f551a559290>...
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Waiting for <arroyo.processing.strategies.guard.StrategyGuard object at 0x7f551a559290> to exit...
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: <arroyo.processing.strategies.guard.StrategyGuard object at 0x7f551a559290> exited successfully, releasing assignment.
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Partition revocation complete.
sentry-self-hosted-ingest-monitors-1                               | 14:09:31 [INFO] arroyo.processing.processor: Stopped
sentry-self-hosted-ingest-monitors-1                               | Updating certificates in /etc/ssl/certs...
sentry-self-hosted-ingest-monitors-1                               | 0 added, 0 removed; done.
sentry-self-hosted-ingest-monitors-1                               | Running hooks in /etc/ca-certificates/update.d...
sentry-self-hosted-ingest-monitors-1                               | done.
sentry-self-hosted-ingest-monitors-1                               | sentry/requirements.txt is deprecated, use sentry/enhance-image.sh - see https://github.com/getsentry/self-hosted#enhance-sentry-image
sentry-self-hosted-ingest-monitors-1                               | %3|1712326215.124|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.30.0.20:9092 failed: Connection refused (after 4ms in state CONNECT)
sentry-self-hosted-ingest-monitors-1                               | %3|1712326216.118|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.30.0.20:9092 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
sentry-self-hosted-ingest-monitors-1                               | %3|1712326246.140|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.30.0.20:9092 failed: Connection refused (after 3ms in state CONNECT, 30 identical error(s) suppressed)
sentry-self-hosted-ingest-monitors-1                               | 14:11:15 [INFO] arroyo.processing.processor: New partitions assigned: {Partition(topic=Topic(name='ingest-monitors'), index=0): 256992}
sentry-self-hosted-ingest-monitors-1                               | 14:12:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:12:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:13:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:13:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:14:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:14:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:15:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:15:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:16:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:16:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:17:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:17:00+00:00')
sentry-self-hosted-ingest-monitors-1                               | 14:18:00 [INFO] sentry: monitors.consumer.clock_tick (reference_datetime='2024-04-05 14:18:00+00:00')

@rmilesson
Author

It seems like things are stuck in a loop? I'm seeing a lot of recurring errors in the log. I don't know if it's just expected log spam or if it's part of a larger issue.

Attaching a larger logfile:
sentry_docker_compose.log

@aldy505
Collaborator

aldy505 commented Apr 6, 2024

For "no performance metrics are recorded", this maybe relates to #2876. What I'd suggest is to replace all occurrences of rust-consumer in docker-compose.yml file to just consumer. Then re-run ./install.sh script.

But let's also take a look at your logs:

arroyo.errors.OffsetOutOfRange: KafkaError{code=_AUTO_OFFSET_RESET,val=-140,str="fetch failed due to requested offset not available on the broker: Broker: Offset out of range (broker 1001)"}

You should try to reset your Kafka consumer group. See #1894 (comment)
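
A rough sketch of such a reset, assuming the crash-looping service is post-process-forwarder-errors as in your logs; the consumer group and topic names below are assumptions, so list the groups first and pick the one that matches:

  docker compose stop post-process-forwarder-errors
  # list the existing consumer groups to find the affected one
  docker compose exec kafka kafka-consumer-groups --bootstrap-server kafka:9092 --list
  # skip past the missing offsets (events before this point are dropped)
  docker compose exec kafka kafka-consumer-groups --bootstrap-server kafka:9092 \
    --group post-process-forwarder --topic events \
    --reset-offsets --to-latest --execute
  docker compose start post-process-forwarder-errors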

2024/04/05 13:55:42 [error] 26#26: *188040 connect() failed (111: Connection refused) while connecting to upstream, client: 172.30.0.1, server: , request: "POST /api/42/envelope/ HTTP/1.0", upstream: "http://172.30.0.51:3000/api/42/envelope/", host: "sentry.tankbar.com"

%3|1712326246.140|FAIL|rdkafka#consumer-1| [thrd:kafka:9092/bootstrap]: kafka:9092/bootstrap: Connect to ipv4#172.30.0.20:9092 failed: Connection refused (after 3ms in state CONNECT, 30 identical error(s) suppressed)

Seeing this, you should check whether your machine has Docker networking configured correctly; it probably relates to the system's IOPS or network segment.
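
A quick way to sanity-check that side, assuming the :3000 upstream that nginx fails to reach is the relay service, as in the default docker-compose.yml (the healthcheck path is Relay's own readiness endpoint):

  docker compose ps relay                  # is the container actually up?
  docker compose logs --tail=100 relay     # why is it refusing connections?
  # probe the upstream from inside the nginx container (busybox wget ships with the alpine image)
  docker compose exec nginx wget -qO- http://relay:3000/api/relay/healthcheck/ready/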

@karatekaneen

We got the exact same error on 24.3.0 and will try to upgrade to the latest version this week to see if that solves our problems.
Assuming Kafka is the original source of the errors, the full log from our instance is below:

Kafka logs
[2024-04-19 08:11:52,835] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2024-04-19 08:12:01,970] WARN The package io.confluent.support.metrics.collectors.FullCollector for collecting the full set of support metrics could not be loaded, so we are reverting to anonymous, basic metric collection. If you are a Confluent customer, please refer to the Confluent Platform documentation, section Proactive Support, on how to activate full metrics collection. (io.confluent.support.metrics.KafkaSupportConfig)
[2024-04-19 08:12:02,031] WARN The support metrics collection feature ("Metrics") of Proactive Support is disabled. (io.confluent.support.metrics.SupportedServerStartable)
[2024-04-19 08:12:14,909] ERROR [Log partition=ingest-replay-recordings-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/ingest-replay-recordings-0/00000000000000026795.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:15,326] ERROR [Log partition=processed-profiles-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/processed-profiles-0/00000000000000004708.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:16,339] ERROR [Log partition=outcomes-billing-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/outcomes-billing-0/00000000000000013656.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:17,548] ERROR [Log partition=profiles-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/profiles-0/00000000000000005287.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:17,903] ERROR [Log partition=ingest-events-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/ingest-events-0/00000000000000642592.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:18,052] ERROR [Log partition=ingest-replay-events-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/ingest-replay-events-0/00000000000000026801.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:18,224] ERROR [Log partition=outcomes-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/outcomes-0/00000000000015162257.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:18,532] ERROR [Log partition=ingest-sessions-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/ingest-sessions-0/00000000000000771737.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:20,499] ERROR [Log partition=snuba-commit-log-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/snuba-commit-log-0/00000000000000349422.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:20,660] ERROR [Log partition=transactions-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/transactions-0/00000000000014122668.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:20,856] ERROR [Log partition=ingest-transactions-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/ingest-transactions-0/00000000000014339943.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:21,097] ERROR [Log partition=profiles-call-tree-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/profiles-call-tree-0/00000000000000004708.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:21,261] ERROR [Log partition=events-0, dir=/var/lib/kafka/data] Could not find offset index file corresponding to log file /var/lib/kafka/data/events-0/00000000000000636874.log, recovering segment and rebuilding index files... (kafka.log.Log)
[2024-04-19 08:12:23,096] INFO Starting the log cleaner (kafka.log.LogCleaner)
[2024-04-19 08:12:24,053] INFO [kafka-log-cleaner-thread-0]: Starting (kafka.log.LogCleaner)
[2024-04-19 08:12:30,052] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
[2024-04-19 08:12:31,451] INFO [SocketServer brokerId=1001] Created data-plane acceptor and processors for endpoint : EndPoint(0.0.0.0,9092,ListenerName(PLAINTEXT),PLAINTEXT) (kafka.network.SocketServer)
[2024-04-19 08:12:31,489] INFO [SocketServer brokerId=1001] Started 1 acceptor threads for data-plane (kafka.network.SocketServer)
[2024-04-19 08:12:32,284] INFO Creating /brokers/ids/1001 (is it secure? false) (kafka.zk.KafkaZkClient)
[2024-04-19 08:12:32,344] INFO Stat of the created znode at /brokers/ids/1001 is: 5357,5357,1713514352335,1713514352335,1,0,0,72057599696240641,180,0,5357
 (kafka.zk.KafkaZkClient)
[2024-04-19 08:12:32,345] INFO Registered broker 1001 at path /brokers/ids/1001 with addresses: ArrayBuffer(EndPoint(kafka,9092,ListenerName(PLAINTEXT),PLAINTEXT)), czxid (broker epoch): 5357 (kafka.zk.KafkaZkClient)
[2024-04-19 08:12:33,129] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
[2024-04-19 08:12:33,284] INFO [SocketServer brokerId=1001] Started data-plane processors for 1 acceptors (kafka.network.SocketServer)
[2024-04-19 08:17:48,733] WARN Client session timed out, have not heard from server in 17174ms for sessionid 0x100000151430001 (org.apache.zookeeper.ClientCnxn)
[2024-04-19 08:18:09,288] WARN Unable to reconnect to ZooKeeper service, session 0x100000151430001 has expired (org.apache.zookeeper.ClientCnxn)
[2024-04-19 08:18:17,652] INFO Creating /brokers/ids/1001 (is it secure? false) (kafka.zk.KafkaZkClient)
[2024-04-19 08:18:17,824] INFO Stat of the created znode at /brokers/ids/1001 is: 5363,5363,1713514697751,1713514697751,1,0,0,72057599696240642,180,0,5363
 (kafka.zk.KafkaZkClient)

@azaslavsky
Contributor

I would try @aldy505's suggestions in the comments above. In particular, we've been having issues with rust-consumer that we are trying to identify and work out.

@rmilesson
Author

I totally forgot to follow up on this! Sorry!

@aldy505's solution above solved our problems and Sentry runs like a charm :)

@karatekaneen

karatekaneen commented Apr 25, 2024

@azaslavsky Tried the solution yesterday and it was working for a couple of hours before the instance stopped capturing issues & performance data.
When I looked at the server before the crash, it was running at 100% CPU and memory (4 CPUs, 12 GB RAM) while handling 2-3 transactions per second, so this might be triggered by a lack of resources on the server.

I will try to add more juice to the machine and see if that helps. Worth mentioning: we've been running Sentry with a similar load on this machine for a couple of years, and in more recent versions (roughly the last 6 months) we've noticed that both CPU and memory usage are much higher.

EDIT:

Just checked in on our second instance to compare. It is running 24.2.0 and is under a similar load.
It does have 16 GB RAM (so 4 GB more than the problematic instance) and is just cruising along at about 20% CPU and 80% memory. So, for us, it looks like the problems started with 24.3.0.

@aldy505
Collaborator

aldy505 commented Apr 25, 2024

Just checked in on our second instance to compare. It is running 24.2.0 and is under a similar load. It does have 16 GB RAM (so 4 GB more than the problematic instance) and is just cruising along at about 20% CPU and 80% memory. So, for us, it looks like the problems started with 24.3.0.

@karatekaneen But on the newer-version instance, have you replaced all occurrences of rust-consumer with just consumer in the docker-compose.yml file yet? (And then re-run sudo ./install.sh?)

@karatekaneen

@aldy505 Yes I have, and I also reset the Kafka consumer groups as described in this comment.
Not sure what the cause and effect is here, since I'm guessing 24.2.0 and 24.3.1 are quite similar when the Rust consumers are disabled.

I think I have some time next week to do further experiments. I'll try adding more memory and updating to the latest version, as well as rolling back so the versions are identical, and I'll let you know what I figure out.

@FrankHeijden

@aldy505's solution works like a charm. CPU was maxing out on the snuba rust-consumers; swapping them to the non-Rust consumers reduced it drastically:
(Screenshots from 2024-04-30: CPU usage before and after the change)

@karatekaneen

It worked by adding more juice to the VM and switching to the non-Rust consumers for us as well. However, we also needed to add --no-strict-offset-reset to the post-process forwarders to avoid everything crashing after a day or two.
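
For anyone hitting the same thing, a rough sketch of where that flag goes; the exact service names and command lines depend on your docker-compose.yml, so treat the example command below as an assumption:

  # find the forwarder/consumer service definitions
  grep -n 'post-process-forwarder' docker-compose.yml
  # append the flag to each affected service's command, e.g.
  #   command: run consumer post-process-forwarder-errors ... --no-strict-offset-reset
  # then recreate the services
  docker compose up -d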
