
limit the 'beacon chain is in inactivity leak' message #8064

Closed
rolfyone opened this issue Mar 11, 2024 · 5 comments · Fixed by #8324

@rolfyone
Contributor

2024-03-12 08:19:00.047
{"@timestamp":"2024-03-11T22:19:00,014","level":"INFO","thread":"TimeTick-1","class":"teku-event-log","message":"Syncing     *** Target slot: 7807295, Head slot: 7772295, Remaining slots: 35000, Connected peers: 95","throwable":""}

2024-03-12 08:18:59.796	
{"@timestamp":"2024-03-11T22:18:59,732","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:58.543	
{"@timestamp":"2024-03-11T22:18:58,323","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:56.788	
{"@timestamp":"2024-03-11T22:18:56,640","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:55.536	
{"@timestamp":"2024-03-11T22:18:55,303","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:53.531	
{"@timestamp":"2024-03-11T22:18:53,327","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:52.027	
{"@timestamp":"2024-03-11T22:18:51,899","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:50.524	
{"@timestamp":"2024-03-11T22:18:50,288","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:49.021	
{"@timestamp":"2024-03-11T22:18:48,774","level":"INFO","thread":"beaconchain-async-7","class":"AbstractEpochProcessor","message":"Beacon chain is in inactivity leak","throwable":""}
2024-03-12 08:18:48.019	
{"@timestamp":"2024-03-11T22:18:48,014","level":"INFO","thread":"TimeTick-1","class":"teku-event-log","message":"Syncing     *** Target slot: 7807294, Head slot: 7772295, Remaining slots: 34999, Connected peers: 95","throwable":""}

It doesn't seem totally consistent, but we only need this message once per epoch...

@mehdi-aouadi
Contributor

Isn't this a duplicate of #7329, fixed by #7788?

@mehdi-aouadi
Contributor

mehdi-aouadi commented Apr 12, 2024

@rolfyone I'm unable to reproduce this behaviour. On which box did you find these logs?
I checked our goerli nodes and the logs are throttled as expected (once per epoch).
The two boxes where I found the same log message are goerli-vc-03 and goerli-vc-01, and on both it appears only once per epoch.
I looked at the code too and didn't see any initialisation issue. Curious to know when this happened...

@lucassaldanha
Member

lucassaldanha commented May 6, 2024

It is possible that this was fixed by #7788. If we can't find it happening on any instance after we released #7788, we can probably close this issue.

@rolfyone
Contributor Author

rolfyone commented May 7, 2024

IIRC there's an initialization issue: if the epoch (slot?) was unset, it would continue to spam. We may just need to cater for the case where it's unset...
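A minimal sketch of the kind of throttle discussed above. This is hypothetical illustration code, not Teku's actual `AbstractEpochProcessor` implementation: the class name `EpochLogThrottler`, the `UNSET` sentinel, and the `shouldLog` method are all invented here. The point is that the "last logged epoch" state needs an explicit unset value that can never equal a real epoch, so an uninitialised tracker cannot cause the message to be emitted on every slot.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch: emit a recurring log message at most once per epoch.
// Using -1 as an explicit "unset" sentinel (no valid epoch is negative)
// avoids the failure mode where an uninitialised tracker keeps matching
// nothing and the message is logged on every invocation.
public class EpochLogThrottler {
  private static final long UNSET = -1L;
  private final AtomicLong lastLoggedEpoch = new AtomicLong(UNSET);

  /** Returns true if the caller should emit the message for this epoch. */
  public boolean shouldLog(final long currentEpoch) {
    final long last = lastLoggedEpoch.get();
    if (last == currentEpoch) {
      return false; // already logged during this epoch
    }
    // First call for this epoch wins; concurrent callers in the same
    // epoch lose the CAS and stay silent.
    return lastLoggedEpoch.compareAndSet(last, currentEpoch);
  }

  public static void main(String[] args) {
    EpochLogThrottler throttler = new EpochLogThrottler();
    System.out.println(throttler.shouldLog(100)); // true: first time in epoch 100
    System.out.println(throttler.shouldLog(100)); // false: suppressed
    System.out.println(throttler.shouldLog(101)); // true: new epoch
  }
}
```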

@zilm13
Contributor

zilm13 commented May 21, 2024

I saw it when testing PeerDAS, so it's probably not fixed by #7788.
