
Neural noise #274

Open
Pugavkomm opened this issue Dec 1, 2021 · 7 comments
Pugavkomm commented Dec 1, 2021

At the moment there is no way to include neural noise in the neuron models. Typically, white Gaussian noise with zero mean and a small standard deviation is added to spiking neurons, usually as an additional term in the membrane potential.
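Roughly what I have in mind, as an illustrative sketch only (the names `v`, `i_in`, `tau`, `dt`, `sigma` are placeholders, not a proposal for this library's API):

```python
import math
import torch

# Illustrative only: one Euler step of a leaky integrator with additive
# white Gaussian noise on the membrane potential. All names are placeholders.
def noisy_membrane_step(v, i_in, dt=0.001, tau=0.01, sigma=0.01):
    dv = dt / tau * (-v + i_in)                           # deterministic leak + input
    noise = sigma * math.sqrt(dt) * torch.randn_like(v)   # zero-mean Gaussian noise term
    return v + dv + noise
```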

Jegp commented Dec 1, 2021

Hi @Pugavkomm! Thanks for bringing this to our attention.
Do you mean noise for the input? Or as a post-hoc manipulation of the membrane voltage?

Do you have a link/reference to an implementation or a model describing this?

Concerning input noise, I think this could be added as a separate layer, which I imagine would be a little cleaner than adding it directly to the neurons (I found this example: https://discuss.pytorch.org/t/writing-a-simple-gaussian-noise-layer-in-pytorch/4694/8). But maybe you're talking about other dynamics?
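Roughly in the spirit of that linked example, a separate noise layer could look something like this (just a sketch, not a proposal for the final API; `sigma` and the training-only behaviour are assumptions):

```python
import torch

class GaussianNoise(torch.nn.Module):
    """Sketch: adds zero-mean Gaussian noise to its input during training."""

    def __init__(self, sigma: float = 0.1):
        super().__init__()
        self.sigma = sigma

    def forward(self, x):
        if self.training and self.sigma > 0:
            x = x + self.sigma * torch.randn_like(x)
        return x
```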

cpehle self-assigned this on Dec 1, 2021
cpehle added the enhancement label on Dec 1, 2021
cpehle commented Dec 1, 2021

This is a good idea. I've considered integrating with https://github.com/google-research/torchsde; it would also be relatively straightforward to implement a simplified version of that ourselves.

Pugavkomm commented
Reference
@Jegp, I meant noise in the neurons themselves. A model with such noise can be seen in:
Spiking recurrent neural networks represent task-relevant neural sequences in rule-dependent computation (eq. 1a).

Explanation
Such noise simulates real neural noise and also allows the network to remain active even with zero input, which can help in implementing complex memory systems.

Implementation
Since this noise is essentially an additional input current, I think it could be implemented as a layer of some kind. It is important to note, however, that it is injected directly into the i-th neuron; see the sketch below.
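A rough sketch of what I mean, assuming a cell with the usual `(input, state) -> (output, state)` interface; `cell`, `sigma`, and the shapes are only placeholders, and actual integration would depend on this library's state structures:

```python
import torch

class NoisyCurrentWrapper(torch.nn.Module):
    """Sketch: injects an independent Gaussian noise current into each neuron
    of a wrapped cell (1:1 mapping)."""

    def __init__(self, cell, sigma: float = 0.01):
        super().__init__()
        self.cell = cell
        self.sigma = sigma

    def forward(self, x, state=None):
        noise = self.sigma * torch.randn_like(x)  # one noise sample per neuron/current
        return self.cell(x + noise, state)
```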

Jegp commented Dec 8, 2021

Right, so it'll be a 1:1 mapping? I think that could be done as a separate module fairly easily. There is also the option of creating some kind of parameterized module within a neuron layer, but that quickly explodes in complexity if the goal is to implement it for all types of neurons.

Pugavkomm commented Dec 8, 2021

Yes, it will. A 1:1 mapping + noise is a good idea.

cpehle commented Dec 9, 2021

I think that makes a lot of sense. A clean way to add this would be to implement a naive Euler–Maruyama stepper (https://en.wikipedia.org/wiki/Euler–Maruyama_method) and then rewrite the time integration of the neuron models in terms of it.
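A naive stepper in that spirit might look roughly like this (a sketch assuming the drift `f` and diffusion `g` are supplied as functions of the state; not an actual solver interface in this library):

```python
import math
import torch

def euler_maruyama_step(f, g, x, dt):
    """One Euler–Maruyama step for dX = f(X) dt + g(X) dW (sketch).

    f: drift function, state -> tensor
    g: diffusion function, state -> tensor
    """
    dw = math.sqrt(dt) * torch.randn_like(x)  # Brownian increment ~ N(0, dt)
    return x + f(x) * dt + g(x) * dw
```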

cpehle commented Dec 9, 2021

The advantages of this approach would be that it a) eliminates the duplication of the solver implementation, and b) Euler and Euler–Maruyama stay compatible: plain Euler is recovered by simply setting the noise term to the "zero" function.
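For example, with the naive stepper sketched above, passing a zero diffusion function recovers an ordinary forward-Euler step:

```python
import torch

x = torch.ones(10)
# With the diffusion g ≡ 0 the stochastic term vanishes and the step
# reduces to a plain forward-Euler step of dx/dt = -x.
x_next = euler_maruyama_step(f=lambda v: -v, g=torch.zeros_like, x=x, dt=1e-3)
```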

cpehle added this to the 0.0.8 milestone on Dec 9, 2021
cpehle modified the milestones: 0.0.8, Release 0.1.1 on Jan 6, 2023