hand annotation problems #4
Comments
Hi, the field `hand_tsl` is the hand translation and the field `obj_transf` is the object transform. If you want to use another dataset for hand-object interactions, you need to parse its annotations into a consistent representation in 3D space. Here, I can share my data loaders for the F-PHAB, HO3D and DexYCB datasets.
Hope these help! Lixin
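As a hedged sketch of what "a consistent representation in 3D space" could look like, here is a hypothetical per-sample container mirroring the fields of `hand_param.pkl` described below (`hand_pose`, `hand_shape`, `hand_tsl`, `obj_transf`); the actual F-PHAB / HO3D / DexYCB loader APIs are not shown in this thread, so the class name and shapes are assumptions based on the replies here:

```python
import numpy as np

class HandObjectSample:
    """Hypothetical unified sample a dataset loader could emit.

    Shapes follow the reply below: 16 x 3 hand pose, 3-vector hand
    translation, 4 x 4 object transform.
    """

    def __init__(self, hand_pose, hand_shape, hand_tsl, obj_transf):
        self.hand_pose = np.asarray(hand_pose)    # (16, 3) MANO axis-angle, incl. root
        self.hand_shape = np.asarray(hand_shape)  # (10,) MANO shape betas
        self.hand_tsl = np.asarray(hand_tsl)      # (3,) hand translation
        self.obj_transf = np.asarray(obj_transf)  # (4, 4) object rigid transform
        assert self.hand_pose.shape == (16, 3)
        assert self.hand_shape.shape == (10,)
        assert self.hand_tsl.shape == (3,)
        assert self.obj_transf.shape == (4, 4)
```

A loader for a new dataset would then only need to fill these four fields, e.g. `HandObjectSample(np.zeros((16, 3)), np.zeros(10), np.zeros(3), np.eye(4))`.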
Hi, I have a follow-up question about using other datasets in Tink. If I want to use a dataset like DexYCB to visualize hand-object interactions with Tink, is it workable to load the data using only the above data loaders? I noticed that in Tink there was a … Thanks for your help!
Also, I found that in `cal_contact_info.py` of Tink there is a function named `get_hand_parameter`, which processes the hand data in the dataset. I want to know whether the MANO parameters of hands in other datasets also need to be processed by this function beforehand, as I couldn't get reasonable results when visualizing DexYCB data with Tink. Sorry for so many questions, and thanks for your kindness!
The coordinate systems used by the DexYCB dataset and by Tink are different. For the image-based DexYCB and OakInk-Image datasets, we acquire the hand pose (16 x 3), the hand translation (3 x 1, via `get_joints_3d[center_idx, :]`), and the object transform (4 x 4) in the camera's coordinate system, in which we can project the hand and object vertices back onto the image plane using the camera intrinsics K. For Tink, we represent the hand mesh in the object's canonical coordinate system. Hope this helps! Lixin
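The conversion implied by this reply (camera frame for DexYCB-style data, object canonical frame for Tink) can be sketched as inverting the 4 x 4 `obj_transf` and applying it to camera-frame points. This is a minimal sketch, not Tink's actual code; the function name is an assumption:

```python
import numpy as np

def cam_to_obj_canonical(verts_cam, obj_transf):
    """Map points from the camera frame into the object's canonical frame.

    verts_cam:  (N, 3) hand/object vertices in camera coordinates
    obj_transf: (4, 4) object pose mapping canonical -> camera coordinates
    """
    T_inv = np.linalg.inv(obj_transf)
    # Append a homogeneous coordinate, transform, then drop it again.
    verts_h = np.concatenate([verts_cam, np.ones((len(verts_cam), 1))], axis=1)
    return (T_inv @ verts_h.T).T[:, :3]
```

With this convention, a hand mesh annotated in the camera frame (as in DexYCB) ends up expressed relative to the object, which is what an object-centric pipeline like Tink expects.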
Dear authors,
Thanks for your awesome work and the dataset!
After learning about your work, I have some questions about the hand annotation files and hope you can help me. Thanks a lot!
The file `hand_param.pkl` has fields `hand_pose`, `hand_shape`, `hand_tsl` and `obj_transf`. Can you please explain what `hand_tsl` and `obj_transf` stand for? What's more, if I want to use annotations from other datasets, how can I get the `hand_tsl` and `obj_transf` parameters, given joint coordinates, camera parameters, hand poses and hand shapes?
Looking forward to your reply!
Thank you!
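Based on the reply above, one hedged way to build these two fields from other datasets' annotations is: take `hand_tsl` as the 3D joint at `center_idx`, and assemble `obj_transf` from the object's rotation `R` (3 x 3) and translation `t` (3,) in the camera frame. The helper names below are hypothetical, not part of the OakInk/Tink code:

```python
import numpy as np

def make_hand_tsl(joints_3d, center_idx=0):
    """Hand translation as the 3D joint at center_idx (per the reply above)."""
    return np.asarray(joints_3d)[center_idx]  # (3,)

def make_obj_transf(R, t):
    """Assemble a 4 x 4 homogeneous object transform from rotation and translation."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R)
    T[:3, 3] = np.asarray(t)
    return T  # (4, 4)
```

Which joint `center_idx` points to depends on the dataset's joint ordering, so it should be checked against the convention used by the loaders shared above.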