
About random batch sampling #2

Open
wants to merge 1 commit into master
Conversation

@terryum commented May 17, 2016

To be honest, it's hard to understand the role of the `if 0: ... else:` statements in the minibatch-learning for-loop.
I think using only `mnist.train.next_batch(batch_size)` can produce decent results.

I also realized that if I use *random batch sampling* only, the performance decreases from 91-ish to 87-ish.
The reason is that random sampling doesn't cover the whole training set in an epoch: some samples are drawn multiple times while others are never drawn.

I changed this part to use `np.random.permutation(n_train)` so that all data are covered in each epoch.
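
For reference, here is a minimal sketch of the idea (not the repository's actual code; `train_images`, `train_labels`, `batch_size`, and the `train_step` callback are assumed names for illustration): shuffling the indices once per epoch and slicing them into disjoint minibatches guarantees every sample is seen exactly once per epoch, whereas drawing random indices can duplicate some samples and skip others.

```python
import numpy as np

def run_epoch(train_images, train_labels, batch_size, train_step):
    """One epoch of minibatch training that covers every sample exactly once."""
    n_train = train_images.shape[0]
    perm = np.random.permutation(n_train)         # shuffle indices once per epoch
    for start in range(0, n_train, batch_size):
        idx = perm[start:start + batch_size]      # disjoint slices -> no duplicates
        train_step(train_images[idx], train_labels[idx])

# In contrast, purely random sampling such as
#     idx = np.random.randint(n_train, size=batch_size)
# draws with replacement across batches, so within one "epoch" some samples
# may appear several times and others not at all, which appears to explain
# the drop from 91-ish to 87-ish mentioned above.
```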