
[Feature request] Support SAME padding for convolutional layers #7549

Closed

zuoxingdong opened this issue May 14, 2018 · 5 comments
zuoxingdong (Contributor) commented May 14, 2018

It would be very convenient to have SAME padding supported for convolutional layers.

zuoxingdong changed the title from "[Feature request] Support same padding for convolutional layers" to "[Feature request] Support SAME padding for convolutional layers" on May 14, 2018
zou3519 (Contributor) commented May 14, 2018

What is "same" padding? Could you give an example?

zuoxingdong (Contributor, Author)

@zou3519 For example, it would make it easy to keep the output size identical to the input size when the stride is 1. More generally,

out_H = torch.ceil(in_H/stride)
out_W = torch.ceil(in_W/stride)
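
For concreteness, here is how the padding amount follows from that relation, TensorFlow-style (a minimal sketch; the helper name `same_padding_1d` is hypothetical, and dilation is assumed to be 1):

```python
import math

def same_padding_1d(in_size, kernel, stride):
    # SAME rule for one spatial dimension (dilation 1): choose the
    # padding so that out_size = ceil(in_size / stride).
    out_size = math.ceil(in_size / stride)
    pad_total = max((out_size - 1) * stride + kernel - in_size, 0)
    pad_before = pad_total // 2          # the extra pixel, if pad_total
    pad_after = pad_total - pad_before   # is odd, goes after (bottom/right)
    return pad_before, pad_after

print(same_padding_1d(5, 3, 2))  # (1, 1): output is ceil(5/2) = 3
print(same_padding_1d(4, 2, 1))  # (0, 1): even kernels pad asymmetrically
```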

crcrpar (Collaborator) commented May 14, 2018

"SAME" padding option is used in TensorFlow. It's explained in below document.
https://www.tensorflow.org/api_guides/python/nn#Convolution
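
To match that behavior with the existing PyTorch ops, one could pad asymmetrically with F.pad before convolving (a hypothetical sketch, not an existing API; assumes dilation 1 and a single stride for both spatial dims):

```python
import torch
import torch.nn.functional as F

def conv2d_same(x, weight, stride=1):
    # Hypothetical helper: emulate TensorFlow's SAME padding, so that
    # out = ceil(in / stride) in each spatial dimension.
    in_h, in_w = x.shape[-2:]
    k_h, k_w = weight.shape[-2:]
    pad_h = max((-(-in_h // stride) - 1) * stride + k_h - in_h, 0)
    pad_w = max((-(-in_w // stride) - 1) * stride + k_w - in_w, 0)
    # F.pad takes (left, right, top, bottom); the extra pixel, when the
    # total padding is odd, goes on the right/bottom as in TensorFlow.
    x = F.pad(x, (pad_w // 2, pad_w - pad_w // 2,
                  pad_h // 2, pad_h - pad_h // 2))
    return F.conv2d(x, weight, stride=stride)

x = torch.randn(1, 3, 7, 7)
w = torch.randn(8, 3, 3, 3)
print(conv2d_same(x, w, stride=2).shape)  # torch.Size([1, 8, 4, 4])
```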

ssnl (Collaborator) commented May 14, 2018

I'm personally not a fan of "same" as a padding option. It hides some degrees of freedom from the user, e.g. when the kernel size is even.
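
To illustrate that point: with an even kernel, the required total padding is odd, so two different placements both yield a "same"-sized output (a small hypothetical check, not code from the thread):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 5, 5)
w = torch.randn(1, 1, 2, 2)
# A 2x2 kernel at stride 1 needs a total padding of 1 per dimension,
# which must go entirely on one side. TensorFlow picks right/bottom;
# padding left/top instead also gives a "same"-sized (but different) output.
y_rb = F.conv2d(F.pad(x, (0, 1, 0, 1)), w)  # pad right/bottom
y_lt = F.conv2d(F.pad(x, (1, 0, 1, 0)), w)  # pad left/top
print(y_rb.shape, y_lt.shape)  # both torch.Size([1, 1, 5, 5])
```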

fmassa (Member) commented May 16, 2018

Duplicate of #3867.
I personally feel that we should not support this unless the behavior exactly matches TensorFlow's. But for that, we would need asymmetric padding for our operations, which would involve some non-trivial changes. I've discussed these points in more detail in #3867.

Closing as a duplicate.
