
Pipeline for mean-field approximation of spiking networks #179

Open
jajcayn opened this issue Aug 4, 2021 · 4 comments
Labels: dev-talk, enhancement

Comments

jajcayn (Collaborator) commented Aug 4, 2021

I recently worked out, more-or-less successfully, a mean-field approximation of a spiking model of the hippocampus. This process can be automated to a large extent.

The idea is to allow users to go from their spiking networks with multiple populations (LIF/EIF/AdEx neurons with conductance- or current-based synapses) to a mean-field approximation in the spirit of ALN. The actual equations of the mean-field model stay the same (with slight adjustments for the number of populations); the only differences are the precomputed linear-nonlinear cascade and the model parameters. The cascade computation depends on the single-neuron parameters of the spiking model, while the parameters of the mean-field approximation depend on both the synaptic parameters and the network parameters (number of neurons, connection probability, etc.).

Some things to consider:

  • The code for computing the cascade is already part of neurolib, in models/aln/aln-precalc/precompute_quantitites/, so I'd refactor it a bit and move it somewhere else.
  • The cascade computation should work as a single function whose arguments are the single-neuron parameters: the user enters the necessary neuronal parameters, and the function does everything and saves the cascade.
  • I'd create something like BaseMeanFieldNeuralMass, on par with the base NeuralMass in MultiModel. It would contain the dynamics, parameterized by the number of populations, etc.
  • Some functions will be necessary to compute the mean-field parameters (the Ks, Js, cs, etc.). These would live in the same subpackage as the cascade-computation code and would save the parameters as JSON next to the cascade (JSON, because the default model parameters in MultiModel are dicts and JSON maps directly onto a dict).
  • Finally, a concrete model would be created by subclassing BaseMeanFieldNeuralMass into a new mean-field model. The user would define the model's name and some basic attributes (e.g. the default noise input), but the dynamics would be the same as ALN; the parameters would be loaded from the JSON produced by the computation, and the linear-nonlinear cascade would be loaded from the computation as well.
  • Profit!
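To make the intended workflow concrete, here is a rough sketch of what the parameter-derivation and JSON-saving steps could look like. Everything here (function names, dict keys, numbers) is illustrative, not existing neurolib code:

```python
# Hypothetical sketch of the proposed pipeline steps (names are illustrative).
import json

def derive_mf_params(network_params):
    """Derive mean-field parameters from spiking-network parameters."""
    N = network_params["n_neurons"]   # neurons per population
    p = network_params["p_connect"]   # connection probability
    K = N * p                         # mean number of inputs per neuron
    return {"K": K, "N": N, "p": p}

def save_mf_params(params, path):
    # JSON maps directly onto the dicts MultiModel uses for default parameters
    with open(path, "w") as f:
        json.dump(params, f, indent=2)

params = derive_mf_params({"n_neurons": 10000, "p_connect": 0.05})
print(params["K"])  # 500.0
```

The new mean-field model class would then load such a JSON file alongside the precomputed cascade.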

Ideas? Critique?

@jajcayn jajcayn added dev-talk Why, how, when, what, you sure? feature request I have a wish! enhancement Enhancing neurolib. labels Aug 4, 2021
@jajcayn jajcayn self-assigned this Aug 4, 2021
@jajcayn jajcayn removed the feature request I have a wish! label Aug 4, 2021
caglorithm (Member) commented:

Yessss I was waiting for this issue ❤️ Some comments on your comments:

  • The code for computing the cascade is already part of neurolib, in models/aln/aln-precalc/precompute_quantitites/, so I'd refactor it a bit and move it somewhere else.
  • The cascade computation should work as a single function whose arguments are the single-neuron parameters: the user enters the necessary neuronal parameters, and the function does everything and saves the cascade.

It would be great to have a nicer wrapper around the precompute functions, I agree. It should ideally stay part of the same package, I think, since it's not really part of the core of neurolib (rather "just" one of the models, namely aln).

  • I'd create something like BaseMeanFieldNeuralMass, on par with the base NeuralMass in MultiModel. It would contain the dynamics, parameterized by the number of populations, etc.

I don't get this, can you explain? What does the computation have to do with MultiModel? In any case, we should first work on the native ALNModel if we extend it, and only then port it to its MultiModel implementation. Right now ALNModel already accepts custom transfer functions, so I don't know exactly what this is referring to.

  • Some functions will be necessary to compute the mean-field parameters (the Ks, Js, cs, etc.). These would live in the same subpackage as the cascade-computation code and would save the parameters as JSON next to the cascade (JSON, because the default model parameters in MultiModel are dicts and JSON maps directly onto a dict).

I'm a bit skeptical on this point. The K's and J's do not affect the computation of the quantities, which is why they don't need to be remembered. I would discourage saving additional files (parameters) in another format (JSON) if it's unnecessary. The quantities.hdftable should contain all the information.

  • Finally, a concrete model would be created by subclassing BaseMeanFieldNeuralMass into a new mean-field model. The user would define the model's name and some basic attributes (e.g. the default noise input), but the dynamics would be the same as ALN; the parameters would be loaded from the JSON produced by the computation, and the linear-nonlinear cascade would be loaded from the computation as well.

How is this different from the existing ALNModel with different quantities? A BaseMeanFieldNeuralMass sounds very general; there are many different kinds of mean-field models out there. As you know, the one we are using is an LN-cascade model, which is why every model with this cascade will just be a version of the ALNModel.

My thoughts:

  • We should avoid adding too much complexity at once.
  • Right now it seems like the ALNModel can do all of this (and should actually be the model that does it all, not only with one set of parameters but with any set).
  • I think focusing on extending the ALNModel could be more fruitful. What seems to be especially missing is having multiple populations with different transfer functions. Is this what you are mostly referring to?


jajcayn commented Aug 5, 2021

I don't get this, can you explain? What does the computation have to do with MultiModel? In any case, we should first work on the native ALNModel if we extend it, and only then port it to its MultiModel implementation. Right now ALNModel already accepts custom transfer functions, so I don't know exactly what this is referring to.

My initial thought was to build this within the MultiModel framework. The idea was: in the native ALN, everything is hardcoded. Yes, it can accept a different lin-nonlin cascade and different parameters, but refactoring it so that you could also set the number of populations within one node (e.g. the hippocampal MF model has 3) would be hard... Within the MultiModel framework I already have ideas on how to do it. Anyway, it'd be nice to have this also in the native ALN model, I agree; I will think more about how to do it. (The problem is that the integration is hardcoded and jitted with numba for 2 populations... At this point I do not see an easy workaround to allow N populations within one node.)

I'm a bit skeptical on this point. The K's and J's do not affect the computation of the quantities, which is why they don't need to be remembered. I would discourage saving additional files (parameters) in another format (JSON) if it's unnecessary. The quantities.hdftable should contain all the information.

Well, yes: the table contains the transfer functions for r, V, and tau. However, other model parameters such as K, J, etc. also depend on the spiking network parameters. So yes, the user can set these up using model.params, but they don't know how :) The functions would simply compute parameters like J, K, etc. from the parameters of the spiking network model. Saving to JSON is probably overkill, but the idea is: the user enters all necessary parameters of their spiking network model, and the pipeline figures out not only the cascade but also the other mean-field parameters like c, J, and K. Moreover, c and J are more-or-less trivial when your spiking network model has current-based synapses, but in the case of conductance-based synapses you actually need to compute c from the maximum PSC.
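For illustration, such a conversion helper could look like this. This is a minimal sketch: the function name, the choice to evaluate the driving force near a typical membrane voltage of -60 mV, and all numbers are assumptions, not existing neurolib code:

```python
def conductance_to_current_c(g_unit_nS, E_rev_mV, V_mV, C_pF):
    """Convert a conductance-based unit synaptic increment into the
    current-based coupling c (in mV/ms) via the peak PSC at voltage V."""
    psc_pA = g_unit_nS * (E_rev_mV - V_mV)  # nS * mV = pA
    return psc_pA / C_pF                    # pA / pF = mV/ms

# e.g. an AMPA synapse, driving force evaluated near rest (V = -60 mV, assumed)
c = conductance_to_current_c(g_unit_nS=0.05, E_rev_mV=0.0, V_mV=-60.0, C_pF=200.0)
print(round(c, 6))  # 0.015 (mV/ms)
```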

How is this different from the existing ALNModel with different quantities? A BaseMeanFieldNeuralMass sounds very general; there are many different kinds of mean-field models out there. As you know, the one we are using is an LN-cascade model, which is why every model with this cascade will just be a version of the ALNModel.

Yes, but see my comments above. The problems are: the number of populations within one ALN node, and functions for computing ALN parameters such as c, J, K, etc.

  • We should avoid adding too much complexity at once.

Yes, agreed! I'd definitely divide this work into multiple PRs.

  • Right now it seems like the ALNModel can do all this (and should be actually the model that does it all, not only with one set of parameters but with any set).

Yes and no. Yes, in the sense that you can just compute different cascades and work out the parameters of ALN. But it would be necessary to add an option for more than 2 populations within one node inside ALN's timeIntegration.py.
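To make the point about generalizing the integration concrete, here is a schematic N-population Euler loop driven by a coupling matrix. The linear dynamics below are only a placeholder for the real ALN right-hand side, and none of this is actual neurolib code:

```python
# Schematic: a 2-population hardcoded update generalized to N populations.
def integrate(rates, coupling, ext_input, dt=0.1, steps=1000, tau=10.0):
    """Euler-integrate N population rates with all-to-all coupling.
    coupling[i][j] weights the input from population j to population i."""
    n = len(rates)
    for _ in range(steps):
        new = []
        for i in range(n):
            # total input to population i: external drive + weighted rates
            inp = ext_input[i] + sum(coupling[i][j] * rates[j] for j in range(n))
            new.append(rates[i] + dt * (-rates[i] + inp) / tau)
        rates = new
    return rates

# 3 populations (as in the hippocampal mean-field model), weak coupling
r = integrate([0.0, 0.0, 0.0],
              [[0.0, 0.2, -0.1], [0.1, 0.0, -0.2], [0.1, 0.1, 0.0]],
              ext_input=[1.0, 0.5, 0.5])
```

The real version would keep the numba-jitted inner loop but let the loop bounds and coupling matrix come from the model definition instead of being fixed to 2 populations.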

  • I think the focusing on extending the ALNModel can be more fruitful. What seems to be especially missing is having multiple populations with different transfer functions. Is this what you are referring to mostly?

Yes, two main points: extending the existing ALN to include more populations, based on the user's spiking network model, plus having functions to derive the ALN parameters (not necessarily saving them to JSON) from the spiking network parameters at hand.


caglorithm commented Aug 5, 2021

The idea was: in the native ALN, everything is hardcoded.

So you want to build a modular ALNModel, is that the goal? With a varying number of populations? I thought that's exactly what MultiModel is for, so I think I don't get it. You could simply create multiple ALNMass models and connect them the usual MultiModel way, but with support for multiple transfer functions.

Yes, but see my comments above. The problems are: the number of populations within one ALN node, and functions for computing ALN parameters such as c, J, K, etc.

The parameters c, J, and K are the same as in the spiking network; they don't need to be computed from anything, and they do not affect the transfer functions. The transfer function is computed from a single neuron. You're referring to "deriving a mean-field model for a spiking network". I think it's more precise to say that the ALNModel is an approximation of an AdEx population with synapses; we can't approximate just any spiking network with the ALNModel.

I don't think it makes a lot of sense to automate every step; the use case is just too special. If someone wants to make a new model, they should know exactly what they are doing and build the model accordingly. If we simply allowed any parameter configuration to be translated into an ALNModel, we would have to do a lot of testing. We don't even know in which parameter regimes the ALNModel is valid; it's all been tested in only one or two regimes.

So, all in all, I would suggest the following, just to save energy and time (obviously this is just a suggestion):

  • Let's build a wrapper around the cascade computation.
  • Let's get multiple transfer functions going, for ALNModel with N = 2 and for MultiModel with arbitrary N.
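Mechanically, "multiple transfer functions" could mean that each population carries its own precomputed lookup table. A minimal stand-in for such a per-population LN-cascade lookup, with made-up tables (nothing here reflects the real precomputed quantities):

```python
from bisect import bisect_left

def make_transfer(mu_grid, r_grid):
    """Return a callable that linearly interpolates a tabulated transfer
    function r(mu), standing in for a precomputed cascade lookup."""
    def transfer(mu):
        if mu <= mu_grid[0]:
            return r_grid[0]
        if mu >= mu_grid[-1]:
            return r_grid[-1]
        i = bisect_left(mu_grid, mu)
        w = (mu - mu_grid[i - 1]) / (mu_grid[i] - mu_grid[i - 1])
        return r_grid[i - 1] * (1 - w) + r_grid[i] * w
    return transfer

# two populations, each with its own (made-up) table
transfer_exc = make_transfer([0.0, 1.0, 2.0], [0.0, 5.0, 20.0])
transfer_inh = make_transfer([0.0, 1.0, 2.0], [0.0, 8.0, 30.0])
print(transfer_exc(1.5))  # 12.5
```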

The rest should be the researcher's job, including verifying whether the mean-field approximation is valid/working at all, which would take an immense amount of time for each constructed model and would probably result in a research paper.


jajcayn commented Aug 5, 2021

I agree with all points.

My initial thoughts were in the direction of:

  • You can compute the cascade for AdEx neurons; the same cascade works for EIF neurons (just do not include adaptation in the mean-field), and it can also be computed for LIF neurons by setting delta_T (the exponential parameter) to zero in the computation of the cascade. I cannot think of any reason why that would not work.
  • When your original spiking network has current-based synapses, then yes, the parameters c, J, and K are the same as in the spiking network. However, when your original spiking network operates with conductance-based synapses, you can compute the c, J, and K parameters as well. E.g., if in the original spiking network you know that the capacitance is 200 pF, the unit increase of conductance per AMPA presynaptic spike is 0.05 nS, E_AMPA is 0 mV, and V_threshold for spiking is -50 mV, then you know that a single AMPA spike elicits about 3 pA in the postsynaptic neuron. Hence you can compute that the c between the presynaptic and postsynaptic populations has to be 0.015 mV/ms in the current-based definition, because that elicits the same postsynaptic current. Similarly, you can compute J (the maximum postsynaptic current) as if all presynaptic neurons spiked at the same time, hence <number of presyn. neurons> * <probability of connection> * <unit increase of conductance> * (E_rev - V) / <capacitance>. In other words, you can work out the parameters even when your original spiking network is conductance-based rather than current-based.
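The all-neurons-spiking-at-once bound for J can be sketched directly from that formula. The numbers below are purely illustrative, and the driving force is evaluated at an assumed membrane voltage of -60 mV:

```python
def max_psc_J(n_presyn, p_connect, g_unit_nS, E_rev_mV, V_mV, C_pF):
    """Upper bound on the postsynaptic drive (mV/ms): all connected
    presynaptic neurons spiking at once, conductance converted to current."""
    return n_presyn * p_connect * g_unit_nS * (E_rev_mV - V_mV) / C_pF

# illustrative numbers: 10000 AMPA inputs at 5% connectivity
J = max_psc_J(10000, 0.05, 0.05, 0.0, -60.0, 200.0)
print(round(J, 3))  # 7.5 (mV/ms)
```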

But you are probably right that you cannot approximate every spiking network with the ALN model just by using different parameters and different cascades and optionally dropping adaptation.
