
Question About Batch_size and Epoch #260

Open
EmirhanToprak opened this issue Jun 30, 2021 · 0 comments


Hi, I applied the code to my own dataset.
I have 360 videos in total across 2 classes (270 for training, 90 for validation), and I set sample_duration = 32.

I don't quite understand how batch_size and epochs interact.
When I set batch_size = 12, the training set per epoch works out to 270 / 12 ≈ 23, so 23 videos?

  • So does batch_size mean that 12 frames are selected from those 23 videos and trained on?

  • Are these 23 videos chosen randomly from the 270 each epoch, or are the same 23 videos used for the full 200-epoch training?

Can anyone explain, please? Thank you.
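For what it's worth, with the usual mini-batch setup, 270 / 12 ≈ 23 is the number of *batches* (optimisation steps) per epoch, not the number of videos: every epoch still visits all 270 training videos, 12 at a time. This is a minimal sketch of that bookkeeping in plain Python (not the repo's actual data loader, just the standard shuffled-batching scheme it presumably follows):

```python
import math
import random

n_train = 270     # training videos, from the question above
batch_size = 12

# Steps (batches) per epoch -- not videos per epoch:
steps_per_epoch = math.ceil(n_train / batch_size)
print(steps_per_epoch)  # 23 (22 full batches of 12, plus one final batch of 6)

# With shuffling enabled (the usual default for training), the order of the
# videos -- and hence the composition of each batch -- changes every epoch,
# but each epoch still covers all 270 videos exactly once.
indices = list(range(n_train))
random.shuffle(indices)  # re-done at the start of each epoch
batches = [indices[i:i + batch_size] for i in range(0, n_train, batch_size)]
```

So over 200 epochs, each video is seen 200 times, in different batch groupings each epoch (assuming shuffling is on).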

The options I selected in opts.py:

    parser.add_argument('--sample_size',
                        default=224,
                        type=int,
                        help='Height and width of inputs')
    parser.add_argument('--sample_duration',
                        default=32,
                        type=int,
                        help='Temporal duration of inputs')
    parser.add_argument('--batch_size',
                        default=12,
                        type=int,
                        help='Batch Size')
    parser.add_argument('--n_val_samples',
                        default=12,
                        type=int,
                        help='Number of validation samples for each activity')
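To make the quoted options concrete, here is a small self-contained reconstruction (a sketch, not the full opts.py) that parses them with their defaults and shows the shape of one training batch. The `(batch, channels, frames, height, width)` layout is the typical input format for 3D CNNs and is an assumption here, not something confirmed by this repo:

```python
import argparse

# Minimal reconstruction of the options quoted above.
parser = argparse.ArgumentParser()
parser.add_argument('--sample_size', default=224, type=int,
                    help='Height and width of inputs')
parser.add_argument('--sample_duration', default=32, type=int,
                    help='Temporal duration of inputs')
parser.add_argument('--batch_size', default=12, type=int,
                    help='Batch Size')
opt = parser.parse_args([])  # use the defaults

# A 3D-CNN batch is typically (batch, channels, frames, height, width),
# so one batch holds 12 clips of 32 consecutive frames each, not 12
# single frames:
shape = (opt.batch_size, 3, opt.sample_duration, opt.sample_size, opt.sample_size)
print(shape)  # (12, 3, 32, 224, 224)
```

In other words, `batch_size = 12` means 12 video *clips* per optimisation step, and `sample_duration = 32` means each clip is 32 frames long.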
