
torch.distributed with single process #773

Open
linshokaku opened this issue Oct 27, 2023 · 0 comments · May be fixed by #777
@linshokaku (Member)
if world_size > 1 and not torch.distributed.is_initialized():  # type: ignore
    torch.distributed.init_process_group(  # type: ignore
        backend, init_method=init_method, world_size=world_size, rank=rank
    )
    torch.distributed.barrier()  # type: ignore

I think torch.distributed.init_process_group() can safely be called even when world_size=1, since it raises no error in that state.

If this branch were allowed to run with world_size=1, any function that assumes torch.distributed.is_initialized() is True could be exercised in a single process, without having to launch the job under MPI.
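As a rough sketch of the idea, a single process can initialize the default process group with world_size=1 and rank=0 (here using the gloo backend and a localhost rendezvous address, both chosen for illustration):

```python
import os
import torch.distributed as dist

# Single-process rendezvous: the one rank acts as its own master.
# MASTER_ADDR/MASTER_PORT values here are illustrative.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")

dist.init_process_group(backend="gloo", world_size=1, rank=0)

assert dist.is_initialized()  # distributed code paths are now reachable
dist.barrier()  # with a single rank, the barrier returns immediately

dist.destroy_process_group()
```

After init_process_group returns, collectives and is_initialized() behave as in a real multi-rank job, which is what makes single-process testing of distributed code paths possible.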

@takagi takagi added the cat:enhancement New feature or request label Oct 30, 2023
@linshokaku linshokaku linked a pull request Oct 30, 2023 that will close this issue