
Pseudolikelihood estimators #58

Open
mnarayan opened this issue Oct 9, 2016 · 1 comment

Comments

mnarayan (Member) commented Oct 9, 2016

sklearn.linear_model has all the ingredients to implement the "pseudolikelihood" regression version of inverse covariance estimation, which does not require the outer block coordinate descent wrapper. It is literally p uncoupled high-dimensional regressions, plus some extra steps to enforce symmetry and positive definiteness using Cholesky decompositions.

It would be easy for us to implement a pseudolikelihood estimator that does this properly, using sklearn.linear_model functions internally. This could serve as a comparison to the log-likelihood-type estimators that QUIC-dirty gives us for mixed-norm penalties. A rough sketch of the idea is below.
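
Here is a minimal sketch of what the Gaussian case could look like, assuming a Meinshausen–Bühlmann-style nodewise construction; the function name `pseudolikelihood_precision`, the fixed `alpha`, and the diagonal-bump positive-definiteness correction are illustrative choices, not a settled design.

```python
import numpy as np
from sklearn.linear_model import Lasso


def pseudolikelihood_precision(X, alpha=0.1):
    """Estimate a precision matrix via p uncoupled lasso regressions."""
    n, p = X.shape
    Theta = np.zeros((p, p))
    for j in range(p):
        others = np.delete(np.arange(p), j)
        lasso = Lasso(alpha=alpha)
        lasso.fit(X[:, others], X[:, j])
        resid = X[:, j] - lasso.predict(X[:, others])
        tau2 = np.mean(resid ** 2)          # nodewise residual variance
        Theta[j, j] = 1.0 / tau2
        Theta[j, others] = -lasso.coef_ / tau2
    # Symmetrize: the j->k and k->j regressions need not agree exactly.
    Theta = 0.5 * (Theta + Theta.T)
    # Enforce positive definiteness: bump the diagonal until a Cholesky
    # factorization succeeds (one simple correction among several possible).
    eps = 1e-8
    while True:
        try:
            np.linalg.cholesky(Theta)
            break
        except np.linalg.LinAlgError:
            Theta += eps * np.eye(p)
            eps *= 10
    return Theta
```

Usage would just be `Theta_hat = pseudolikelihood_precision(X)`; in the real estimator the per-node `alpha` would be selected by cross-validation or a model-selection criterion rather than fixed.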

To be fleshed out more

References:

mnarayan (Member, Author) commented:

This would wrap around both sklearn's linear model estimators (for Gaussian data) and pyglmnet (https://github.com/pavanramkumar/pyglmnet/tree/master/pyglmnet) for non-Gaussian data.

This would be very useful for people working with binary and Poisson observations.
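
For binary data, a rough sketch of the same nodewise scheme could use sklearn's L1-penalized logistic regression as the per-node model (Ising-style neighborhood selection); for Poisson observations, a pyglmnet GLM would play the same role. The name `binary_neighborhoods`, the fixed `C`, and the OR-rule symmetrization below are illustrative assumptions, not a fixed API.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def binary_neighborhoods(X, C=0.5):
    """Recover a symmetric edge-set estimate for binary observations."""
    n, p = X.shape
    coef = np.zeros((p, p))
    for j in range(p):
        others = np.delete(np.arange(p), j)
        clf = LogisticRegression(penalty="l1", C=C, solver="liblinear")
        clf.fit(X[:, others], X[:, j])
        coef[j, others] = clf.coef_.ravel()
    # OR-rule symmetrization: keep an edge if either nodewise fit selects it.
    adjacency = (np.abs(coef) > 0) | (np.abs(coef.T) > 0)
    np.fill_diagonal(adjacency, False)
    return adjacency
```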
