Normalization in preprocessing #18

Open

longbowzhang opened this issue Nov 4, 2020 · 3 comments
@longbowzhang

Hi @Shimingyi
Sorry to bother you again. I have a question about the preprocessing part in section 3.3.
[attached screenshot: the preprocessing description from Section 3.3 of the paper]

Intuitively, both the bone lengths and the joint rotations are estimated from the positional relations between kinematically neighbouring joints (please correct me if I am wrong).
But after the 2nd normalization step in the preprocessing, the positional relation between two joints is completely lost, since each joint is normalized w.r.t. its own mean and std.

Therefore, I am wondering about the motivation for the 2nd step. Did you try training the network without the 2nd normalization?
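
To make the concern concrete, here is a minimal NumPy sketch of the per-joint normalization as I understand it (the names and shapes are my own illustration, not the repo's code):

```python
import numpy as np

# Hypothetical 2D joint positions: N frames, 17 joints, (x, y).
poses_2d = np.random.randn(1000, 17, 2)

# 2nd normalization step as I understand it: standardize each joint
# coordinate with its own mean/std over all frames.
mean = poses_2d.mean(axis=0, keepdims=True)   # shape [1, 17, 2]
std = poses_2d.std(axis=0, keepdims=True)     # shape [1, 17, 2]
poses_2d_norm = (poses_2d - mean) / (std + 1e-8)

# The offset between two kinematically neighbouring joints, e.g.
# poses_2d_norm[:, child] - poses_2d_norm[:, parent], is no longer a
# (scaled) bone vector, because each joint was shifted and scaled
# independently.
```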

Best.

@Shimingyi
Owner

Hi @longbowzhang,

The normalization is only applied to the 2D input and bone_length. When we use them in the forward kinematics layer, everything is unnormalized. And for 3d_pose_gt we haven't applied this kind of normalization, so it won't lose information.
Normalization in dataloader: code
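
Roughly, that step amounts to the following (a simplified sketch with made-up names, not the exact code; please refer to the link above):

```python
def normalize(x, mean, std, eps=1e-8):
    # Standardize with statistics precomputed over the training set.
    return (x - mean) / (std + eps)

def unnormalize(x, mean, std, eps=1e-8):
    # Invert the standardization, e.g. before the forward kinematics layer.
    return x * (std + eps) + mean
```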

Best,
Mingyi

@longbowzhang
Author

I think I failed to express myself clearly.

The normalization is only applied to the 2D input and bone_length. # Yes, I got this!

Let me use an extreme case to explain my question.
Assume the training data are all videos of static subjects, i.e. the subject does not move at all across the frames.
Then after the normalization, the positions of the 2D joints are all zeros. What is fed into the network is exactly zeros of shape [N, 17, 2], right?
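
A minimal NumPy sketch of this case (again with made-up shapes):

```python
import numpy as np

# Static subject: the same 2D pose repeated in every frame.
static_pose = np.random.randn(1, 17, 2)
poses_2d = np.repeat(static_pose, 1000, axis=0)    # shape [N, 17, 2]

mean = poses_2d.mean(axis=0, keepdims=True)
std = poses_2d.std(axis=0, keepdims=True)          # all zeros here
poses_2d_norm = (poses_2d - mean) / (std + 1e-8)   # eps avoids 0/0

print(np.abs(poses_2d_norm).max())  # 0.0: the network only ever sees zeros
```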

Therefore, I am wondering whether it is necessary to apply the 2nd normalization step to the 2D joints.

@Shimingyi
Owner

OK, I got your idea. I think it's always necessary to scale the inputs to have a mean of 0 and a variance of 1 in a deep learning pipeline. In our previous experiments we did this comparison, and the results showed that standard normalization improves the numerical stability of this model.

To explain this better, I will double-check it and report the performance in this thread.
