
Runtime error during backward_pass() of PoolingLayer #72

Open
krworks opened this issue Jan 1, 2020 · 1 comment

krworks commented Jan 1, 2020

I greatly appreciate your work and clearly written code, which gives incredible insight into the backpropagation technique. I've encountered a small bug that's easy to fix, but I don't want to open a pull request because I'm not sure what the default values should be.

It's at layers.py:400 (at the end of the line, last param):

accum_grad = column_to_image(accum_grad_col, (batch_size * channels, 1, height, width), self.pool_shape, self.stride, 0)

The last parameter is supposed to be the string-style padding type, but the call passes a literal 0 instead of self.padding. PoolingLayer's default value for self.padding is also 0, which of course triggers the same error. If 0 is meant to be an acceptable default, it should be accepted by the receiving function, determine_padding, which is where the error is raised:
pad_h, pad_w = determine_padding(filter_shape, output_shape)
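To illustrate why the literal 0 blows up here, the following is a minimal sketch of how determine_padding appears to behave, reconstructed from the traceback rather than copied from the repo's actual source. It only recognizes the string modes "same" and "valid"; any other value (such as 0) falls through, returns None, and the tuple unpacking at the call site raises a TypeError.

```python
import math

def determine_padding(filter_shape, output_shape="same"):
    """Sketch of the repo's padding helper (an assumption, not the real code)."""
    if output_shape == "valid":
        # No padding in 'valid' mode
        return (0, 0), (0, 0)
    elif output_shape == "same":
        # Symmetric 'same' padding derived from the filter size
        filter_height, filter_width = filter_shape
        pad_h1 = int(math.floor((filter_height - 1) / 2))
        pad_h2 = int(math.ceil((filter_height - 1) / 2))
        pad_w1 = int(math.floor((filter_width - 1) / 2))
        pad_w2 = int(math.ceil((filter_width - 1) / 2))
        return (pad_h1, pad_h2), (pad_w1, pad_w2)
    # Any other value -- e.g. the literal 0 -- implicitly returns None

pad_h, pad_w = determine_padding((2, 2), "same")  # fine: ((0, 1), (0, 1))

try:
    pad_h, pad_w = determine_padding((2, 2), 0)   # returns None
except TypeError as e:
    print("unpacking failed:", e)
```

This reproduces the reported runtime error: unpacking the None return value fails with a TypeError, not an explicit "invalid padding" message, which makes the root cause easy to miss.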

Again, thank you for this repository. Amazing work.


krworks commented Jan 1, 2020

Just to give a bit more context: I'm instantiating the PoolingLayer via:
clf.add(MaxPooling2D(pool_shape=(2,2), stride=2, padding='same'))
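A hypothetical sketch of the fix the report suggests (not the repo's actual code): give PoolingLayer a string default that determine_padding understands, and forward self.padding at the layers.py:400 call site instead of the hard-coded 0.

```python
class PoolingLayer:
    """Hypothetical patch sketch; attribute names follow the issue, not the repo."""

    def __init__(self, pool_shape=(2, 2), stride=1, padding="valid"):
        # 'valid' (no padding) is a safe string default that a helper like
        # determine_padding can interpret; the literal 0 cannot be.
        self.pool_shape = pool_shape
        self.stride = stride
        self.padding = padding

    # In backward_pass(), the column_to_image call would then pass
    # self.padding as its last argument rather than the literal 0, e.g.:
    # column_to_image(accum_grad_col, shape, self.pool_shape, self.stride,
    #                 self.padding)

layer = PoolingLayer(pool_shape=(2, 2), stride=2, padding="same")
```

With that change, the padding='same' instantiation shown above would flow through to determine_padding unchanged instead of being silently replaced by 0.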
