
RES-863: Use 'Layer Norm' instead of 'Batch Norm' on linear networks #967

Open
wants to merge 1 commit into base: master
Conversation

lscheinkman (Contributor)

@subutai Please review.
The main change in this PR is the switch from BatchNorm to LayerNorm in the LinearSDR module.
See https://arxiv.org/abs/1607.06450
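
For context, a minimal sketch of the change, assuming a simplified LinearSDR-style module (illustrative names only; the real LinearSDR also applies k-winner and weight sparsity, which are omitted here):

    import torch.nn as nn

    class LinearSDRSketch(nn.Module):
        # Illustrative sketch only, not the actual htmresearch code.
        def __init__(self, inputFeatures, n, useBatchNorm=True):
            super().__init__()
            self.l1 = nn.Linear(inputFeatures, n)
            self.bn = None
            if useBatchNorm:
                # Before this PR: nn.BatchNorm1d(n, affine=False)
                # After: LayerNorm normalizes each sample over its
                # features rather than over the batch dimension.
                self.bn = nn.LayerNorm(n)

        def forward(self, x):
            x = self.l1(x)
            if self.bn is not None:
                x = self.bn(x)
            return x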

subutai (Member) commented Mar 8, 2019

This doesn't affect any of our existing sparse networks, right?

lscheinkman (Contributor, Author)

It affects LinearSDR. We can now use batch_size=1.
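
The practical difference, as a quick generic PyTorch check (not htmresearch code): BatchNorm1d in training mode needs more than one sample to compute batch statistics, whereas LayerNorm normalizes each sample on its own, so a batch of one is fine:

    import torch
    import torch.nn as nn

    n = 128
    x = torch.randn(1, n)  # a batch containing a single sample

    bn = nn.BatchNorm1d(n, affine=False)
    bn.train()
    try:
        bn(x)
    except ValueError as err:
        # Batch statistics are undefined for a single training sample.
        print("BatchNorm1d failed:", err)

    ln = nn.LayerNorm(n)
    print(ln(x).shape)  # torch.Size([1, 128]); works with batch_size=1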

lscheinkman (Contributor, Author) commented Mar 8, 2019

However, we cannot use LayerNorm on CNNSDR2d; it is only suitable for linear and RNN networks.
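
One way to see why LayerNorm is a poor fit for the conv case (again a generic PyTorch sketch, not the CNNSDR2d code): nn.LayerNorm normalizes over a fixed set of trailing dimensions, so on an (N, C, H, W) feature map the spatial size would have to be baked in at construction time, unlike BatchNorm2d:

    import torch
    import torch.nn as nn

    x = torch.randn(8, 32, 28, 28)  # (batch, channels, height, width)

    # BatchNorm2d normalizes per channel and does not depend on H and W:
    bn = nn.BatchNorm2d(32, affine=False)
    print(bn(x).shape)  # torch.Size([8, 32, 28, 28])

    # LayerNorm normalizes over the trailing dims given at construction,
    # so the spatial size must be known and fixed up front:
    ln = nn.LayerNorm([32, 28, 28])
    print(ln(x).shape)  # torch.Size([8, 32, 28, 28])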

htmresearch/frameworks/pytorch/cnn_sdr.py (review thread, outdated and resolved)

     self.bn = None
     if useBatchNorm:
-        self.bn = nn.BatchNorm1d(self.n, affine=False)
+        self.bn = nn.LayerNorm(self.n)
Member:

How would this affect GSC results?
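
For context on the diff above, one PyTorch default differs between the two layers: BatchNorm1d with affine=False has no learnable parameters, while nn.LayerNorm enables a learnable elementwise affine transform by default (nn.LayerNorm(self.n, elementwise_affine=False) would match the old non-affine behaviour):

    import torch.nn as nn

    n = 128
    old = nn.BatchNorm1d(n, affine=False)  # no learnable parameters
    new = nn.LayerNorm(n)                  # elementwise_affine=True by default

    print(sum(p.numel() for p in old.parameters()))  # 0
    print(sum(p.numel() for p in new.parameters()))  # 256 (weight and bias, n each)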

@@ -81,10 +81,12 @@ def __init__(self,
         self.l1 = nn.Linear(inputFeatures, self.n)
         self.weightSparsity = weightSparsity
         self.learningIterations = 0
         if self.k <= 0:
             self.k = self.n

         self.bn = None
         if useBatchNorm:
Member:

Should probably change the name of the parameter too.
