
Does Texar support multi-gpus? #232

Open · buptcjj opened this issue Oct 14, 2019 · 3 comments · May be fixed by #281

Labels: enhancement (New feature or request) · help wanted (Extra attention is needed) · question (Further information is requested)

Comments


buptcjj commented Oct 14, 2019

Does Texar support multi-gpus?

@huzecong (Collaborator)

I'm assuming that by multi-GPU you mean something similar to DataParallel or DistributedDataParallel. We support multi-GPU to some extent: the modules we provide are generally device-agnostic, so they can be placed on any device and should play nicely with DP or DDP. However:

  1. The data modules and Executor do not support multi-GPU; you'll need to move data between devices manually.
  2. It may not be possible to place different parts of a model on different GPUs, or at least it's not possible automatically.
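A minimal sketch of the pattern described above, with a plain `nn.Linear` standing in for a device-agnostic Texar-PyTorch module (the module name is illustrative, not Texar API). It wraps the module in `DataParallel` when multiple GPUs are available and moves the batch manually, as point 1 notes; on CPU-only machines it simply runs unwrapped.

```python
import torch
import torch.nn as nn

# Placeholder for any device-agnostic module, e.g. a Texar encoder.
module = nn.Linear(8, 2)

if torch.cuda.device_count() > 1:
    # DataParallel replicates the module across GPUs and splits the
    # batch dimension of each input among them.
    module = nn.DataParallel(module).cuda()

batch = torch.randn(4, 8)
# The data modules don't handle device placement, so move data manually.
if next(module.parameters()).is_cuda:
    batch = batch.cuda()

out = module(batch)
print(out.shape)  # torch.Size([4, 2])
```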

@huzecong huzecong added the question Further information is requested label Oct 14, 2019
@ZhitingHu (Member)

It's easy to run on multiple or distributed GPUs, e.g., with Horovod. Here are two examples of using Texar with Horovod (simply adding a couple of extra lines of code to your single-GPU Texar code): here and here.

The examples are on Texar-TF, but Horovod also supports PyTorch.
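A hedged sketch of what the "couple of extra lines" look like with Horovod's PyTorch API (the model and optimizer are illustrative stand-ins, not Texar code). The `try`/`except` fallback to a single process is an addition for machines without Horovod installed; under `horovodrun` the extra lines broadcast initial state and wrap the optimizer so gradients are averaged across workers.

```python
import torch
import torch.nn as nn

try:
    import horovod.torch as hvd
    hvd.init()
    rank, world_size = hvd.rank(), hvd.size()
except ImportError:
    # Horovod not installed: behave as a single-process run.
    hvd = None
    rank, world_size = 0, 1

model = nn.Linear(16, 4)  # stands in for any Texar-PyTorch model
# A common convention: scale the learning rate by the number of workers.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * world_size)

if hvd is not None:
    # The extra Horovod lines: sync initial state from rank 0, then
    # wrap the optimizer to all-reduce gradients on each step.
    hvd.broadcast_parameters(model.state_dict(), root_rank=0)
    hvd.broadcast_optimizer_state(optimizer, root_rank=0)
    optimizer = hvd.DistributedOptimizer(
        optimizer, named_parameters=model.named_parameters())

def train_step(batch, labels):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), labels)
    loss.backward()  # with Horovod, gradients are averaged across workers
    optimizer.step()
    return loss.item()
```

The rest of the training loop is unchanged from single-GPU code, which is what makes this approach attractive for the Texar-TF examples linked above.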

@YongtaoGe

@ZhitingHu Could you add an example of using Horovod with texar-torch?

@ZhitingHu ZhitingHu added enhancement New feature or request help wanted Extra attention is needed labels Nov 22, 2019
@gpengzhi gpengzhi linked a pull request Dec 27, 2019 that will close this issue

4 participants