
Speculative mix-ahead #356

Open
olofson opened this issue Oct 12, 2022 · 0 comments
Labels
multitdreading – Issues related to distributing work across CPU threads
optimization – Code optimizations, and performance oriented refactoring

Comments

@olofson
Owner

olofson commented Oct 12, 2022

Not sure if this is viable or necessary in this day and age, but the idea might be useful in some odd scenarios...

Anyway, the basic idea is to steal the branch prediction concept from CPU pipelines. In our case, we essentially assume that no further events will arrive for the foreseeable future, so we continuously take snapshots of the current subgraph state, and then allow processing to run ahead from there into a buffer, potentially in a background thread.

If a new event does arrive, we reset the subgraph state to the most recent snapshot, resume processing from there, and discard any speculatively buffered output.
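A minimal, single-threaded C sketch of the snapshot / run-ahead / rollback cycle, using a toy one-oscillator "subgraph" in place of a real voice graph. All names and types here are hypothetical illustrations, not actual Audiality 2 API; a real version would run the speculation in a background thread, on a private copy of the state, with proper synchronization against the audio callback:

```c
/*
 * Hypothetical sketch of speculative mix-ahead. None of these types or
 * functions exist in Audiality 2; they only illustrate the idea.
 */
#include <math.h>
#include <stdio.h>

#define MIXAHEAD_FRAMES 256     /* How far ahead we dare to speculate */
#define TWO_PI          6.2831853f

typedef struct SUBGRAPH
{
    float   phase;      /* Oscillator phase - the "state" we snapshot */
    float   dphase;     /* Phase increment per frame */
} SUBGRAPH;

typedef struct MIXAHEAD
{
    SUBGRAPH    snapshot;                   /* State as of 'base_frame' */
    unsigned    base_frame;                 /* Frame where speculation started */
    float       buffer[MIXAHEAD_FRAMES];    /* Speculatively mixed output */
    unsigned    buffered;                   /* Valid frames in 'buffer' */
} MIXAHEAD;

/* Render 'frames' frames of audio from 'sg' into 'out'. */
static void subgraph_process(SUBGRAPH *sg, float *out, unsigned frames)
{
    unsigned i;
    for(i = 0; i < frames; ++i)
    {
        out[i] = sinf(sg->phase);
        sg->phase += sg->dphase;
    }
}

/*
 * Assume no further events: snapshot the current state at frame 'now',
 * then mix ahead into 'ma->buffer'. (Here the live state is advanced and
 * later rolled back; a threaded version would speculate on a copy.)
 */
static void mixahead_speculate(MIXAHEAD *ma, SUBGRAPH *sg, unsigned now)
{
    ma->snapshot = *sg;     /* Cheap here; a real graph needs a deep copy */
    ma->base_frame = now;
    subgraph_process(sg, ma->buffer, MIXAHEAD_FRAMES);
    ma->buffered = MIXAHEAD_FRAMES;
}

/*
 * A new event arrived at 'frame': reset the subgraph to the snapshot,
 * reprocess up to the event to recover the state at that point, and
 * discard the speculated output.
 */
static void mixahead_rollback(MIXAHEAD *ma, SUBGRAPH *sg, unsigned frame)
{
    float discard[MIXAHEAD_FRAMES];
    *sg = ma->snapshot;                     /* Reset subgraph state */
    subgraph_process(sg, discard, frame - ma->base_frame);
    ma->buffered = 0;                       /* Buffered output is now invalid */
}

int main(void)
{
    SUBGRAPH sg = { 0.0f, TWO_PI * 440.0f / 44100.0f };
    MIXAHEAD ma;
    mixahead_speculate(&ma, &sg, 0);    /* Speculate: no events expected... */
    mixahead_rollback(&ma, &sg, 100);   /* ...but one arrives at frame 100 */
    printf("Resumed at frame 100; phase = %f\n", (double)sg.phase);
    return 0;
}
```

The trade-off is the cost of taking snapshots plus the wasted work whenever an event actually arrives, so this only pays off if events are rare relative to the mix-ahead window.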

@olofson added the optimization and multitdreading labels on Oct 12, 2022