
Can you please support context parallel? #162

Open
DZ9 opened this issue Apr 23, 2024 · 0 comments
DZ9 commented Apr 23, 2024

NeMo supports context parallelism natively, via the adaptation in the [MegatronGPTModel](https://github.com/NVIDIA/NeMo/blob/96187eac848ebf02c56e9fc658a57a500a56a842/nemo/collections/nlp/models/language_modeling/megatron_gpt_model.py#L1039) `get_forward_output_and_loss_func` method.

I found that the DPO, PPO, and RM models all inherit from `MegatronGPTModel` but override `get_forward_output_and_loss_func` without the context-parallel adaptation. Could you please add this support? It is important for efficient long-context training.
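For reference, the missing adaptation boils down to sharding each batch's sequence dimension across the context-parallel ranks before the forward pass. The sketch below is only an illustration of the Megatron-style sharding scheme (not NeMo's actual code): the sequence is split into `2 * cp_size` chunks, and rank `r` receives chunks `r` and `2*cp_size - 1 - r`, which balances causal-attention work across ranks. The helper name `cp_shard` is hypothetical.

```python
# Hedged sketch of Megatron-style context-parallel sequence sharding.
# `cp_shard` is a hypothetical illustrative helper, not a NeMo API.

def cp_shard(tokens, cp_size, cp_rank):
    """Return the slice of `tokens` that context-parallel rank `cp_rank`
    would process under a load-balanced 2*cp_size chunking scheme."""
    num_chunks = 2 * cp_size
    assert len(tokens) % num_chunks == 0, "sequence length must divide evenly"
    chunk = len(tokens) // num_chunks
    chunks = [tokens[i * chunk:(i + 1) * chunk] for i in range(num_chunks)]
    # Pair an "early" chunk with a "late" chunk so that, under a causal
    # attention mask, every rank does roughly the same amount of work.
    return chunks[cp_rank] + chunks[num_chunks - 1 - cp_rank]

if __name__ == "__main__":
    seq = list(range(16))
    shards = [cp_shard(seq, cp_size=2, cp_rank=r) for r in range(2)]
    print(shards)  # rank 0 gets chunks 0 and 3; rank 1 gets chunks 1 and 2
```

Each override of `get_forward_output_and_loss_func` in the aligner models would need an equivalent step (plus the matching loss reduction across CP ranks) for context parallelism to work.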

Labels: none
Projects: none
1 participant