Question about the batch size - 16 / 8 #46

Open
MingxiLi opened this issue Mar 18, 2020 · 1 comment

Comments

@MingxiLi

Hi, thanks for your great work.

I have a question about the batch size you used in the experiments. For protocol 1 you used a batch size of 16, and for protocol 2 a batch size of 8. It seems that researchers in head pose estimation prefer small batch sizes, but as far as I know, training can be more stable with a larger batch size.

Did you do any experiments on how the batch size affects the final performance of the model?

@shamangary
Owner

Hello, I don't have an ablation experiment on the batch size, but we do observe that a small batch size works much better for head pose learning. Compared with the common understanding of batch size in tasks such as image classification, head pose is a concept shared across all training samples, whereas general classification involves high-level semantic categories. It is possible that the different natures of these tasks lead to such results.
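
For reference, a minimal sketch of such a batch-size ablation is shown below. It assumes a generic Keras-style regression setup; `build_model` and the random arrays are placeholders for illustration, not this repository's actual training code. The idea is simply to retrain with each batch size and compare validation MAE on yaw/pitch/roll.

```python
# Minimal batch-size ablation sketch (hypothetical setup, not the repo's code).
import numpy as np
from tensorflow.keras import layers, models

def build_model(input_shape=(64, 64, 3)):
    # Toy stand-in for a head-pose regressor predicting yaw, pitch, roll.
    return models.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        layers.GlobalAveragePooling2D(),
        layers.Dense(3),  # yaw, pitch, roll in degrees
    ])

# Dummy data so the sketch runs; replace with the real protocol 1 / protocol 2 splits.
x_train = np.random.rand(256, 64, 64, 3).astype("float32")
y_train = np.random.uniform(-90, 90, (256, 3)).astype("float32")
x_val = np.random.rand(64, 64, 64, 3).astype("float32")
y_val = np.random.uniform(-90, 90, (64, 3)).astype("float32")

for batch_size in (8, 16, 32, 64):
    model = build_model()
    model.compile(optimizer="adam", loss="mae")
    hist = model.fit(x_train, y_train,
                     validation_data=(x_val, y_val),
                     batch_size=batch_size, epochs=5, verbose=0)
    print(f"batch_size={batch_size}: val MAE = {hist.history['val_loss'][-1]:.2f}")
```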
