Hi @LuukvBree, glad to hear you got the Mini Cheetah working! We would be happy to see some videos, and also to include it in our suite of environments if that is of interest/possible. Unfortunately, AMP is not on our immediate roadmap right now; we want to move towards adding support for more manipulation-related tasks. I haven't used AMP myself, so I'm not sure what exact changes are needed. It would be great to understand this better so we can try to fit it into our development plans. We would be more than happy if you (or someone else) would like to take the initiative on integrating AMP and contribute it to the framework :)
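For context on "what exact changes are needed": the core addition AMP (Adversarial Motion Priors) makes to a plain RL pipeline is a discriminator that is trained to tell reference-motion transitions from policy transitions, whose output is turned into a style reward added to the task reward. Below is a minimal, self-contained sketch of that idea. It is not the Orbit or IsaacGymEnvs implementation: the linear discriminator stands in for a real neural network, the random arrays stand in for motion-capture clips and policy rollouts, and all names (`LinearDiscriminator`, `style_reward`, `OBS_DIM`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical size of a flattened transition feature (s, s').
OBS_DIM = 8

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LinearDiscriminator:
    """Toy stand-in for the AMP discriminator network (linear model here)."""

    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.b = 0.0
        self.lr = lr

    def logits(self, x):
        return x @ self.w + self.b

    def update(self, ref_batch, policy_batch):
        # Binary cross-entropy: reference transitions labeled 1, policy 0.
        for x, y in [(ref_batch, 1.0), (policy_batch, 0.0)]:
            p = sigmoid(self.logits(x))
            grad = p - y                          # dBCE/dlogit
            self.w -= self.lr * (x.T @ grad) / len(x)
            self.b -= self.lr * grad.mean()

def style_reward(disc, transitions):
    """AMP-style reward r = -log(1 - D(s, s')), clipped for numerical safety."""
    d = sigmoid(disc.logits(transitions))
    return -np.log(np.clip(1.0 - d, 1e-4, 1.0))

# Fake data standing in for mocap reference clips vs. policy rollouts.
ref = rng.normal(1.0, 0.5, size=(256, OBS_DIM))
pol = rng.normal(-1.0, 0.5, size=(256, OBS_DIM))

disc = LinearDiscriminator(OBS_DIM)
for _ in range(200):
    disc.update(ref, pol)

# Transitions resembling the reference motion earn a higher style reward,
# which the RL agent would receive on top of its task reward.
r_ref = style_reward(disc, ref).mean()
r_pol = style_reward(disc, pol).mean()
```

Framework-wise, this suggests the main integration points are: a buffer of reference transitions loaded from motion files, a discriminator update step interleaved with the policy update, and a hook to add the style reward to the environment's task reward.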
-
Last month, I started training the locomotion skill of the Mini Cheetah with RL. I have managed to get environments in IsaacGymEnvs and OmniIsaacGymEnvs working for the already-implemented robots, but not yet for the Mini Cheetah. When I discovered Orbit, I began implementing the Mini Cheetah in it, and I can now successfully train the locomotion skill using the rsl_rl train.py script.
Now, IsaacGymEnvs includes an example of training with Adversarial Motion Priors (AMP). Since I believe Orbit is the preferred framework for future robot-learning development on Isaac Sim, I was wondering whether there are plans to add an AMP example to Orbit, especially since the link to the roadmap is broken.
Thanks!