ENH add support for L1 + L2 regularization in SparseLogisticRegression #231

Open
mathurinm opened this issue Mar 25, 2024 · 1 comment

@mathurinm
Collaborator

Currently we only support L1 regularization in logistic regression: https://contrib.scikit-learn.org/skglm/generated/skglm.SparseLogisticRegression.html

We could introduce a second regularization parameter corresponding to a squared L2 penalty, in the same way that ElasticNet extends the Lasso.
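
For reference, here is a minimal sketch of what this could look like today by composing skglm's generic building blocks (this assumes the current public API: `GeneralizedLinearEstimator`, the `Logistic` datafit, the `L1_plus_L2` penalty and the `ProxNewton` solver); the actual enhancement might instead expose something like an `l1_ratio` parameter on `SparseLogisticRegression` itself, as ElasticNet does:

```python
# Sketch only: elastic-net-penalized logistic regression composed from
# skglm's generic pieces. The names below are skglm's current API as I
# understand it; treat this as an illustration, not the proposed change.
import numpy as np
from skglm import GeneralizedLinearEstimator
from skglm.datafits import Logistic
from skglm.penalties import L1_plus_L2
from skglm.solvers import ProxNewton

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
# binary labels in {-1, 1}, as expected by the Logistic datafit
y = np.sign(X @ rng.standard_normal(50) + 0.5 * rng.standard_normal(200))

alpha = 0.01    # overall regularization strength
l1_ratio = 0.7  # mix between L1 (sparsity) and squared L2 (shrinkage)

clf = GeneralizedLinearEstimator(
    datafit=Logistic(),
    penalty=L1_plus_L2(alpha=alpha, l1_ratio=l1_ratio),
    solver=ProxNewton(),
)
clf.fit(X, y)
print("nonzero coefficients:", np.sum(clf.coef_ != 0))
```

The enhancement would essentially wire such a penalty into `SparseLogisticRegression`, e.g. with an `l1_ratio` parameter defaulting to 1 (pure L1) so the current behaviour is unchanged.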

@PascalCarrivain, would you give it a try?

@PascalCarrivain
Contributor

OK, why not.
