
[Feature Request] Adding the ppo trainer #607

Open
Esmail-ibraheem opened this issue Apr 30, 2024 · 0 comments


Feature Request

Add the Proximal Policy Optimization (PPO) trainer.

Motivation

Adding a PPO trainer would make it possible to compare the two trainers, PPO and DPO, directly.
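
For reference, a minimal sketch of what a single PPO step could look like, assuming the Hugging Face TRL library's `PPOTrainer` (pre-0.9, `step`-based API); the model name, prompt, and constant reward are illustrative placeholders, not part of this request:

```python
# Minimal PPO sketch with TRL's step-based PPOTrainer API (trl < 0.9).
# "gpt2", the prompt, and the constant reward are placeholders; in practice
# the reward would come from a trained reward model.
import torch
from transformers import AutoTokenizer
from trl import AutoModelForCausalLMWithValueHead, PPOConfig, PPOTrainer

config = PPOConfig(
    model_name="gpt2",
    learning_rate=1.41e-5,
    batch_size=1,       # step() expects exactly batch_size samples
    mini_batch_size=1,
)

model = AutoModelForCausalLMWithValueHead.from_pretrained(config.model_name)
ref_model = AutoModelForCausalLMWithValueHead.from_pretrained(config.model_name)
tokenizer = AutoTokenizer.from_pretrained(config.model_name)
tokenizer.pad_token = tokenizer.eos_token

ppo_trainer = PPOTrainer(config, model, ref_model, tokenizer)

# One PPO step: generate a response to a query, score it, update the policy.
query = tokenizer.encode("The movie was", return_tensors="pt").squeeze(0)
response = ppo_trainer.generate(
    [query],
    max_new_tokens=16,
    return_prompt=False,
    pad_token_id=tokenizer.eos_token_id,
)[0]
reward = torch.tensor(1.0)  # placeholder; a reward model would produce this

# Returns training statistics (policy/value losses, KL to ref_model, etc.)
stats = ppo_trainer.step([query], [response], [reward])
```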

Additional Context

No response

@abhishekkrthakur abhishekkrthakur changed the title Adding the ppo trainer [Feature Request] Adding the ppo trainer Apr 30, 2024
Projects: None yet
Development: No branches or pull requests
1 participant