Feature suggestion: naive convolution : gauss trick #20

Open
pfeatherstone opened this issue Jul 26, 2021 · 2 comments
@pfeatherstone

How about using this for the naive convolution to reduce 4 convs down to 3?
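
For reference, the Gauss / Karatsuba 3-multiplication trick carries over to convolutions because conv is bilinear in the input and the kernel. A minimal sketch of both variants, assuming the real and imaginary parts are kept as separate tensors and using plain torch.nn.functional.conv1d (the function names here are illustrative, not cplxmodule's API):

```python
import torch
import torch.nn.functional as F

def cplx_conv1d_naive(xr, xi, wr, wi):
    # four real convolutions: (xr + i*xi) * (wr + i*wi)
    real = F.conv1d(xr, wr) - F.conv1d(xi, wi)
    imag = F.conv1d(xr, wi) + F.conv1d(xi, wr)
    return real, imag

def cplx_conv1d_gauss(xr, xi, wr, wi):
    # Gauss trick: three real convolutions instead of four
    t1 = F.conv1d(xr + xi, wr)   # conv(xr, wr) + conv(xi, wr)
    t2 = F.conv1d(xr, wi - wr)   # conv(xr, wi) - conv(xr, wr)
    t3 = F.conv1d(xi, wr + wi)   # conv(xi, wr) + conv(xi, wi)
    return t1 - t3, t1 + t2      # (real, imag)

# quick numerical check
xr, xi = torch.randn(2, 4, 32), torch.randn(2, 4, 32)
wr, wi = torch.randn(8, 4, 3), torch.randn(8, 4, 3)
r0, i0 = cplx_conv1d_naive(xr, xi, wr, wi)
r1, i1 = cplx_conv1d_gauss(xr, xi, wr, wi)
assert torch.allclose(r0, r1, atol=1e-5)
assert torch.allclose(i0, i1, atol=1e-5)
```

The saving is one conv in exchange for a few elementwise adds on the inputs and weights and, as noted in the reply below, somewhat worse floating-point accuracy because the intermediate sums can cancel.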

@ivannz
Owner

ivannz commented Jul 28, 2021

@pfeatherstone why not? Although, as the linked wiki states:

There is a trade-off in that there may be some loss of precision when using floating point.
So faster convolutions come at a cost.

cplxmodule currently uses cplx.conv_quick for most convolutions (the non-grouped ones), which makes two calls to conv at the cost of extra concatenation and slicing steps and, hence, extra copying and memory.
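
For context, one way to get down to two conv calls with concatenation and slicing is to batch the real and imaginary parts together; this is only an illustrative sketch, and the actual cplx.conv_quick implementation may differ:

```python
import torch
import torch.nn.functional as F

def cplx_conv1d_two_calls(xr, xi, wr, wi):
    # stack the real/imag parts along the batch axis and convolve once per
    # weight part; the cat and the slicing below are the copying overhead
    x = torch.cat([xr, xi], dim=0)
    ur = F.conv1d(x, wr)   # [conv(xr, wr); conv(xi, wr)]
    ui = F.conv1d(x, wi)   # [conv(xr, wi); conv(xi, wi)]
    n = xr.shape[0]
    real = ur[:n] - ui[n:]
    imag = ui[:n] + ur[n:]
    return real, imag
```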

On the other hand, cplxmodule currently uses the naïve four-op implementation for cplx.linear, although I've got both the Gauss-trick and the concatenation version implemented and tested as well.
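
A Gauss-trick version of the linear op would follow the same pattern as the conv sketch above; here is a sketch with hypothetical names (not cplxmodule's internals), built on torch.nn.functional.linear:

```python
import torch
import torch.nn.functional as F

def cplx_linear_gauss(xr, xi, wr, wi, br=None, bi=None):
    # three real matmuls instead of four, same identity as for conv
    t1 = F.linear(xr + xi, wr)
    t2 = F.linear(xr, wi - wr)
    t3 = F.linear(xi, wr + wi)
    real, imag = t1 - t3, t1 + t2
    if br is not None:
        real = real + br
    if bi is not None:
        imag = imag + bi
    return real, imag
```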

Unfortunately, I did not design a convenient mechanism in cplxmodule for swapping out the underlying kernels used by the operations in the layers. So for now the selection is hardcoded to specific implementations (linear, bilinear, and transposed conv).

@ivannz
Owner

ivannz commented Jul 28, 2021

I have just pushed a commit to master implementing and testing this. However, please keep in mind the last paragraph of my previous response: currently you will have to manually change a couple of lines in cplxmodule/cplx.py.

@ivannz ivannz self-assigned this Jul 28, 2021
@ivannz ivannz added the enhancement New feature or request label Jul 28, 2021