
Feature: Remote control RL processes #47

Open
damajor opened this issue May 11, 2022 · 4 comments
damajor commented May 11, 2022

In order to train on high-end hardware (high-core-count CPUs or multiple professional GPUs), would it be possible to decouple game execution, control, and metrics gathering from the compute side?

AechPro (Collaborator) commented May 12, 2022

I'm not sure I understand the question. If you're asking whether the game and RLGym can be run on a compute cluster, the answer is that they probably can't without a lot of effort. Rocket League really wants a GPU and Windows.

damajor (Author) commented May 12, 2022

What I mean is having a Windows agent that manages Rocket League itself and communicates with a Linux compute cluster for training. Inference can be done on Windows.

AechPro (Collaborator) commented May 12, 2022

This doesn't sound like an RLGym question. RLGym is just an interface to Rocket League; it does not train anything. If you run an agent on Windows and use it to interact with RLGym, you can do whatever you want with that data afterwards.
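To illustrate that point, here is a minimal sketch of the split the original question describes: a Windows worker collects transitions from the game and ships them over TCP to a learner process (which could live on a Linux cluster). This is not RLGym API; the length-prefixed pickle protocol, the dict-shaped transitions, and the `learner` function are all illustrative assumptions, shown here running in one process over localhost.

```python
# Sketch: ship rollout transitions from a game-facing "worker" to a remote
# "learner" over TCP. Everything here is stdlib; the transition dicts stand
# in for data an RLGym environment would produce on the Windows machine.
import pickle
import socket
import struct
import threading


def _recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed early")
        buf += chunk
    return buf


def send_msg(sock, obj):
    """Length-prefix a pickled object so the receiver knows how much to read."""
    payload = pickle.dumps(obj)
    sock.sendall(struct.pack("!I", len(payload)) + payload)


def recv_msg(sock):
    """Read one length-prefixed pickled object."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return pickle.loads(_recv_exact(sock, length))


def learner(server_sock, out):
    """Learner side: accept one worker and collect its transitions."""
    conn, _ = server_sock.accept()
    with conn:
        while True:
            msg = recv_msg(conn)
            if msg == "done":
                break
            out.append(msg)


# Demo: learner and worker in one process via localhost.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=learner, args=(server, received))
t.start()

worker = socket.socket()
worker.connect(("127.0.0.1", port))
# Stand-in for transitions gathered from the game on the Windows side.
for step in range(3):
    send_msg(worker, {"obs": [0.0] * 4, "action": step, "reward": 1.0})
send_msg(worker, "done")
worker.close()
t.join()
server.close()
print(len(received))  # 3 transitions collected by the "learner"
```

In a real setup the learner would push updated policy weights back over the same channel so the Windows side can keep doing inference locally, which matches the split proposed above.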

damajor (Author) commented May 12, 2022

Hmm, well, never mind, I need to get my hands on it before arguing :)
