
Create more examples using nilearn #81

Open
mnarayan opened this issue Jan 19, 2017 · 5 comments
@mnarayan (Member)
Since our estimators are sklearn compatible, we can enable those who use nilearn for graphical model estimation to use skggm estimators as well.

This would involve using ConnectivityMeasure from nilearn as a base estimator class (which is designed for dealing with multiple graphical models) and then creating an skggm relevant implementation under the hood. Would be useful for application examples, where a similar procedure would need to be applied in any case.

@jasonlaska (Member) commented Jan 19, 2017

This seems like just an example, right? How is this different from just showing an example that includes lines similar to:

from nilearn.connectome import ConnectivityMeasure
from inverse_covariance import QuicGraphLassoCV  # skggm

cm = ConnectivityMeasure(
    cov_estimator=QuicGraphLassoCV(),
    kind='precision',
)

cm.fit(...)  # list of per-subject (n_samples, n_regions) time-series arrays

@mnarayan (Member, Author) commented Jan 19, 2017 via email

@mnarayan mnarayan changed the title Create plugin for nilearn Create more examples using nilearn Oct 9, 2017
@mnarayan mnarayan added this to the Public Version 0.3 milestone Oct 9, 2017
@dPys commented Jan 30, 2018

@mnarayan -- this was an interesting discussion that I was hoping we could resume?

My two cents on this: the nilearn folks are correct that single precision matrices for single individuals are not comparable across individuals. The philosophy behind PyNets, however, is that averages of global features across a distribution of precision matrices, built upon multiple node definitions for a single individual, are comparable across individuals, and this is where skggm is compatible.

Curious to hear your thoughts on this, and particularly whether you think this impacts how skggm's estimators should be used in such scenarios...

-derek

@mnarayan (Member, Author)

Hi @dpisner453. I think whether individual graphs are comparable or not depends on:
a) the kinds of individual-level and population-level statistical models assumed, and
b) the types of loss functions used to compare precision matrices.

There are also many flexible intermediate options, not yet implemented here, for optimally pooling information across subjects without compromising individual variation. This is ongoing research with many open questions, and skggm is an avenue for benchmarking and testing these options.

Our initial focus has been on building better default estimators at the level of a single data matrix; that does not mean no pooling ought to happen. We don't have anything implementing estimators for a population of networks yet.

Last I checked, nilearn's implementations prefer to assume shared model selection across all individuals and then refit that model onto individual subject data. This is quite reasonable for the sorts of applications nilearn is used for, like classification/clustering and so forth.

I am familiar with nilearn's ConnectivityMeasure API. Does PyNets have something similar? If PyNets has an explicitly per-subject estimation philosophy, then our current estimators will work for you out of the box.
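For what that per-subject workflow might look like, here is a minimal sketch. Assumptions: the time series are synthetic stand-ins for real data, and sklearn's GraphicalLassoCV is used so the snippet runs without skggm installed; since skggm's QuicGraphLassoCV follows the same sklearn estimator API, it should slot into the same loop.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
n_subjects, n_samples, n_regions = 3, 400, 5

# Simulated per-subject time series; in practice these would come
# from PyNets (or any pipeline producing (n_samples, n_regions) arrays).
subject_time_series = [
    rng.standard_normal((n_samples, n_regions)) for _ in range(n_subjects)
]

# Separate model selection per subject: each subject gets its own
# cross-validated penalty, rather than one penalty shared across the group.
precisions = []
for ts in subject_time_series:
    est = GraphicalLassoCV().fit(ts)  # skggm's QuicGraphLassoCV() should slot in here
    precisions.append(est.precision_)

print(precisions[0].shape)  # (5, 5), one precision matrix per subject
```

The key design point is that model selection happens inside the loop, so no information is shared across subjects.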

@dPys commented Jan 31, 2018

Thanks for the response @mnarayan !

Are you referring to population-based network estimators like lasso-penalized D-trace loss?

I should clarify I am not suggesting the use of estimators that pool information across subjects, only those that pool information within subjects.

> Our initial focus has been on building better default estimators at the level of a single data matrix, not that no pooling ought to happen. We don't have anything implementing estimators for a population of networks yet.

Fair. And it's not entirely clear, in the context of structural and functional brain networks at least, whether any such pooling would be better accomplished at the level of the individual matrix estimators, at the level of the statistics calculated from those matrices, or both.

> Last I checked nilearn's implementations prefer to assume shared model selection across all individuals and refits that model onto individual subject data. This is quite reasonable for the sorts of applications nilearn is used for like classification/clustering and so forth.

Definitely, and I've used Nilearn in precisely that manner in the past. But I'm in the business of studying individual network neurophenotypes, where shared model selection can obscure nuanced individual differences, in functional networks in particular. Studying brain network phenotypes means going beyond black-box approaches like classification (i.e. also, or alternatively, extracting robust network statistics that can be used with GLM, SEM, etc.).
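As one concrete example of such a statistic, partial correlations can be read directly off an estimated precision matrix. A minimal numpy sketch (the helper name partial_correlations is hypothetical, not from any of the libraries discussed):

```python
import numpy as np

def partial_correlations(precision):
    """Convert a precision (inverse covariance) matrix to partial correlations.

    Uses the standard identity
        pcorr[i, j] = -precision[i, j] / sqrt(precision[i, i] * precision[j, j])
    with the diagonal set to 1.
    """
    d = np.sqrt(np.diag(precision))
    pcorr = -precision / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)
    return pcorr

# Toy precision matrix (symmetric positive definite).
P = np.array([[ 2.0, -0.8,  0.0],
              [-0.8,  2.0, -0.8],
              [ 0.0, -0.8,  2.0]])
pc = partial_correlations(P)
print(round(pc[0, 1], 2))  # 0.4
```

The resulting matrix (or graph statistics computed from it) can then feed downstream models such as a GLM, one value per subject per edge or per summary measure.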

PyNets can access nilearn's library of ConnectivityMeasure estimators, but it is also able, as you noted, to use skggm's estimators out of the box. Very much looking forward to seeing what skggm comes up with next.

-derek
