
Feature quantization #237

Draft · wants to merge 2 commits into main
Conversation

@Jegp (Member) commented on Aug 15, 2021

Quantization works well with feed-forward dynamics. Unfortunately, we used the functional API to implement the recurrent layers, which makes them hard to optimize, since PyTorch doesn't recognize them as layers.
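For context, a minimal sketch of why this matters (not Norse code; the function and weight names are illustrative): PyTorch's post-training dynamic quantization swaps out registered `nn.Module` layers such as `nn.Linear`, so a feed-forward stack of modules quantizes cleanly, while a purely functional recurrent step built from bare weight tensors is invisible to it.

```python
import torch
import torch.nn as nn

# Feed-forward block built from nn.Modules: quantize_dynamic can find and
# replace the nn.Linear layers with quantized equivalents.
ff = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
ff_q = torch.quantization.quantize_dynamic(ff, {nn.Linear}, dtype=torch.qint8)

# Functional recurrent step over bare tensors: there is no nn.Linear module
# here, so quantize_dynamic has nothing to act on.
def recurrent_step(x, spikes, w_in, w_rec):
    return torch.relu(x @ w_in + spikes @ w_rec)
```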

  1. One approach could be to implement custom quantization primitives, but that would take time. It could, however, be necessary if we'd like to fully support quantization at the operation level.
  2. Another idea would be to refactor the current recurrent implementations as an activation plus two linear layers.

I like the second approach, since that is exactly what a recurrent layer is. We could even go further and remove our recurrent functionals, since I suspect few people use the functions directly, and removing them reduces maintenance. The current PR contains a suggestion for what the LIFRecurrentCell module could look like, and a test that demonstrates that it works.
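A hedged sketch of what such a refactor could look like (this is not the code in the PR; the LIF dynamics below are a simplified placeholder for Norse's actual LIF step, and all parameter names are illustrative):

```python
import torch
import torch.nn as nn


class LIFRecurrentCell(nn.Module):
    """Recurrent LIF cell expressed as two nn.Linear layers plus the spiking
    activation, so PyTorch's quantization tooling can find the linear layers."""

    def __init__(self, input_size: int, hidden_size: int, threshold: float = 1.0):
        super().__init__()
        self.input_weights = nn.Linear(input_size, hidden_size, bias=False)
        self.recurrent_weights = nn.Linear(hidden_size, hidden_size, bias=False)
        self.threshold = threshold

    def forward(self, x, state):
        spikes, voltage = state
        # The two linear projections replace the bare weight tensors of the
        # functional API.
        current = self.input_weights(x) + self.recurrent_weights(spikes)
        # Placeholder dynamics: integrate, threshold, reset. The real cell
        # would call the LIF step functional here, with leak and surrogate
        # gradient handling.
        voltage = voltage + current
        spikes = (voltage > self.threshold).float()
        voltage = voltage * (1.0 - spikes)
        return spikes, (spikes, voltage)


# Because the recurrent weights are now ordinary nn.Linear modules,
# standard dynamic quantization applies directly:
cell = LIFRecurrentCell(64, 128)
cell_q = torch.quantization.quantize_dynamic(cell, {nn.Linear}, dtype=torch.qint8)
```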

I'm eager to hear your opinion, @cpehle! This may also be interesting for you, @ChFrenkel.

@Jegp added the enhancement label on Aug 15, 2021
@Jegp requested a review from cpehle on Aug 15, 2021
@Jegp self-assigned this on Aug 15, 2021