
Batch MCTS is needed !!! #11

Open
Nightbringers opened this issue Mar 15, 2024 · 7 comments

Comments

@Nightbringers

The current batching only works across multiple games, not within one search. For example, if one search uses 400 simulations, those 400 simulations run one by one, not batched.

@lowrollr
Owner

I'm not sure how you'd reconcile/merge search tree states across a single game, as the next MCTS iteration depends on the state reached from the previous one.

If you know of a batching algorithm for this please share 😀

@Nightbringers
Author

https://github.com/liuanji/WU-UCT/tree/master

This is one batched MCTS algorithm. There are three popular parallel MCTS approaches: LeafP parallelizes the simulation step, TreeP uses virtual loss to encourage exploration, and RootP parallelizes the subtrees of the root node.
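
As a rough illustration, here is a minimal sketch of the TreeP-style virtual-loss idea in plain Python: select a batch of leaves while marking each selected path with virtual loss, evaluate all leaves in one network call, then replace the virtual loss with the real results. `evaluate_batch`, `expand`, and all other names are illustrative assumptions, not the WU-UCT code or this project's API.

```python
# Sketch of virtual-loss (TreeP-style) batching within a single search.
# `evaluate_batch` and `expand` are assumed callbacks, not existing APIs.
import math

class Node:
    def __init__(self, prior):
        self.prior = prior
        self.children = {}        # action -> Node
        self.visit_count = 0
        self.value_sum = 0.0
        self.virtual_loss = 0     # pending evaluations passing through this node

    def q(self):
        visits = self.visit_count + self.virtual_loss
        if visits == 0:
            return 0.0
        # Each unit of virtual loss counts as a lost playout, which discourages
        # selecting the same path again before its evaluation returns.
        return (self.value_sum - self.virtual_loss) / visits

def select_leaf(root):
    """Walk down by UCB, then add virtual loss along the selected path."""
    path, node = [root], root
    while node.children:
        total = sum(c.visit_count + c.virtual_loss for c in node.children.values())
        def ucb(child):
            return child.q() + 1.5 * child.prior * math.sqrt(total + 1) / (
                1 + child.visit_count + child.virtual_loss)
        _, node = max(node.children.items(), key=lambda kv: ucb(kv[1]))
        path.append(node)
    for n in path:
        n.virtual_loss += 1
    return path

def backup(path, value):
    """Replace virtual loss with the real evaluation result."""
    for n in reversed(path):
        n.virtual_loss -= 1
        n.visit_count += 1
        n.value_sum += value
        value = -value  # two-player sign flip

def run_batched_search(root, num_simulations, batch_size, evaluate_batch, expand):
    done = 0
    while done < num_simulations:
        n = min(batch_size, num_simulations - done)
        paths = [select_leaf(root) for _ in range(n)]
        values = evaluate_batch([p[-1] for p in paths])  # one network call per batch
        for path, value in zip(paths, values):
            # Note: with virtual loss, duplicate leaves within a batch are
            # possible; a real implementation would dedupe or expand only once.
            expand(path[-1])
            backup(path, value)
        done += n
```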

@lowrollr
Owner

Looks interesting, thanks for sharing!

When I have some time I may explore adding some of these ideas, not sure how well it will work with the existing batching paradigm -- answering that will require some more investigation on my end.

@lowrollr
Owner

It would be very very neat to be able to batch across many environments as well as across MCTS iterations!

@Nightbringers
Author

It would be much faster when using a single environment. Training an AI needs many environments, but a human playing against the AI uses only one environment; in that case the AI's moves would be much faster!

@lowrollr
Owner

I agree! This project is mostly focused on training at scale, but nevertheless it could be interesting to allow for a mix of batching across many environments as well as within single tree searches. If I can find a way to go about it that doesn't involve overhauling the core functionality of batched MCTS then I will consider adding it.

@Nightbringers
Author

Nightbringers commented Mar 20, 2024

Maybe keep the core many-environment batched MCTS unchanged and first add single-tree batched MCTS as a separate piece, then consider combining the two later. That way would be simpler and less error-prone.
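
Building on the helpers in the sketch above, one hypothetical way the two modes could eventually be combined is to flatten pending leaves from every environment's search into a single evaluation batch. `searches`, `search.root`, and `leaves_per_search` are assumed names for illustration only.

```python
# Hypothetical combination of the two batching modes: gather pending leaves
# from several independent single-tree searches (one per environment) and
# evaluate them in one network call. Reuses select_leaf/backup from the
# sketch above; `search.root` is an assumed attribute.
def step_all_searches(searches, leaves_per_search, evaluate_batch):
    pending = []  # selected paths awaiting evaluation, across all environments
    for search in searches:
        for _ in range(leaves_per_search):
            pending.append(select_leaf(search.root))
    values = evaluate_batch([path[-1] for path in pending])  # one combined batch
    for path, value in zip(pending, values):
        backup(path, value)
```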
