Now that we're switching GLM to expect a single neuron, we want a PopulationGLM.
This will have the sklearn estimator API, and under the hood will just vmap all the methods.
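The vmap idea can be sketched minimally (hypothetical function names, not the actual NeMoS API): a single-neuron method is written once, and `jax.vmap` maps it over a leading neuron axis of the stacked coefficients.

```python
# Sketch: reuse single-neuron GLM code for a population by vmapping
# over the neuron axis of the coefficients. Names here are illustrative.
import jax
import jax.numpy as jnp

def single_neuron_rate(X, coef):
    """Predicted rate for one neuron: X is (n_samp, n_feat), coef is (n_feat,)."""
    return jnp.exp(X @ coef)

X = jnp.ones((5, 3))          # shared design matrix, (n_samp, n_feat)
coefs = jnp.zeros((2, 3))     # per-neuron coefficients, (n_neurons, n_feat)

# vmap over the neuron axis of `coefs`; X is broadcast unchanged (in_axes=None).
population_rate = jax.vmap(single_neuron_rate, in_axes=(None, 0))
rates = population_rate(X, coefs)   # shape (n_neurons, n_samp)
```

With zero coefficients the linear predictor is zero everywhere, so every rate is `exp(0) = 1`.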
The question is whether users are required to use identical regularizers, observation models, and design matrices for every neuron, or whether these can be pytrees that allow different values per neuron (Edoardo thinks this can be seamless).
What this will not allow you to do is fit each neuron on a different machine (or actually, this may be possible with sharded input).
Instead of neurons being an additional dimension of the array, they should be the highest level of a pytree: the dictionary keys or the values in a list.
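A minimal sketch of that layout (illustrative key names, not a committed structure): neurons are dict keys at the top level of the pytree, and `jax.tree_util.tree_map` applies the same operation to each neuron's leaves.

```python
# Sketch: neurons as the top level of a pytree rather than an array axis.
# Each key holds that neuron's coefficients; tree_map visits each neuron.
import jax
import jax.numpy as jnp

params = {
    "neuron_0": jnp.array([0.1, 0.2]),
    "neuron_1": jnp.array([0.3, 0.4]),
}
X = jnp.ones((4, 2))  # shared design matrix, (n_samp, n_feat)

# Apply the same linear predictor to every neuron's coefficients.
predictions = jax.tree_util.tree_map(lambda coef: X @ coef, params)
# predictions is a dict with the same keys, each value of shape (n_samp,)
```

This structure is what would let regularizers or observation models differ per neuron: any per-neuron setting can live under the same keys.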
score should have a kwarg (or similar) to specify how to combine (reduce) across neurons: mean, min, l2_norm.
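One possible shape for that reduction kwarg (hypothetical helper and argument name, not a settled signature):

```python
# Sketch: reduce a vector of per-neuron scores with a caller-chosen
# aggregation, as suggested for PopulationGLM.score.
import jax.numpy as jnp

def reduce_scores(per_neuron_scores, aggregate="mean"):
    """Collapse per-neuron scores to a scalar; `aggregate` picks the reducer."""
    reducers = {
        "mean": jnp.mean,
        "min": jnp.min,
        "l2_norm": jnp.linalg.norm,
    }
    return reducers[aggregate](per_neuron_scores)

scores = jnp.array([3.0, 4.0])
# mean -> 3.5, min -> 3.0, l2_norm -> 5.0
```

A dict dispatch keeps the set of reducers explicit and makes an unknown `aggregate` fail loudly with a KeyError.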
Related to #65, because its inputs are the output of the designer (or similarly structured pytrees).
Variable selection is done by masking the coefficients. If X is a 2d array of shape (n_samp, n_feat), the mask is a 2d float array of 0s and 1s (JAX on GPU doesn't like mixing data types) of shape (n_feat, n_neurons), the same as the coefficients. If X is a FeaturePytree, the mask is a dict whose keys match those of X, and mask[key] is a float array of 1s and 0s of shape (n_neurons,).
The mask cannot be changed after fit; it is a property of the class.
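For the 2d-array case, the masking described above can be sketched as an elementwise multiply of the coefficients before forming the linear predictor (variable names here are illustrative):

```python
# Sketch: variable selection via a float 0/1 mask of shape
# (n_feat, n_neurons), the same shape as the coefficients.
import jax.numpy as jnp

n_samp, n_feat, n_neurons = 6, 3, 2
X = jnp.ones((n_samp, n_feat))
coef = jnp.ones((n_feat, n_neurons))

# Neuron 0 uses only feature 0; neuron 1 uses features 1 and 2.
mask = jnp.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 1.0]])

# Masked coefficients zero out the unused features per neuron.
linear_pred = X @ (coef * mask)   # shape (n_samp, n_neurons)
```

Keeping the mask as floats (rather than bools or ints) avoids mixed-dtype arithmetic, which is the GPU concern noted above.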