Robustness and quality-of-life improvements to Shared Sparsity contribution #58

Open
kgreenewald wants to merge 2 commits into master

Conversation

kgreenewald
No description provided.

Updates to improve robustness and quality of life:

- Addition of a `max_norm` constraint to avoid overflow or non-sparse outcomes under the nonconvex regularizer (this involves a minor change to the optimization algorithm). The theory had included this parameter, but we omitted it from the original version since it only seems to matter in extreme circumstances where the hyperparameters are poorly chosen. There is also an `auto` option for easy use, as well as `none` to disable the constraint. A sketch of the projection step appears after this list.
- Addition of an `auto` option for the `mcp_alpha` parameter, since this parameter may be somewhat mysterious to some users; the penalty it controls is sketched after this list.
- Addition of an `include_bias` option, defaulted to `True`. This subtracts the mean of each group of covariates and outcomes (something we typically did outside the module), which is equivalent to adding an unregularized bias term to the regressions; see the centering sketch after this list.
- Replacement of the `auto` option for `mcp_lambda` with a `proportional_lambda` option which, if set to `True`, interprets the value of `mcp_lambda` as a constant of proportionality applied to the previous `auto` value. In practice I found that the coefficient of the `auto` option almost always needs tuning, but that the analytic expression it uses was extremely helpful otherwise; see the final sketch after this list.
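
For context, a norm constraint of this kind is typically enforced by projecting the coefficients back onto the norm ball after each optimization step. A minimal sketch of that projection, with illustrative names and an `auto` heuristic that is assumed here rather than taken from the module:

```python
import numpy as np

def project_max_norm(coef, max_norm):
    """Project a coefficient matrix back onto a ball of radius max_norm.

    max_norm may be a positive float, "auto", or "none", mirroring the
    options described above. The "auto" radius below is a hypothetical
    placeholder, not the module's actual heuristic.
    """
    if max_norm == "none":
        return coef
    if max_norm == "auto":
        max_norm = np.sqrt(coef.size)  # illustrative default radius
    norm = np.linalg.norm(coef)
    if norm > max_norm:
        coef = coef * (max_norm / norm)  # rescale onto the ball
    return coef
```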
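
For readers unfamiliar with `mcp_alpha`: in the standard minimax concave penalty (MCP) of Zhang (2010), the second parameter (often written gamma) controls how quickly the penalty flattens out. Assuming `mcp_alpha` plays that role, the penalty looks like:

```python
import numpy as np

def mcp_penalty(t, mcp_lambda, mcp_alpha):
    """Elementwise MCP value, assuming mcp_alpha is the concavity
    parameter gamma of the standard MCP formulation.

    The penalty grows like mcp_lambda * |t| near zero, flattens
    quadratically, and is constant once |t| >= mcp_alpha * mcp_lambda.
    """
    t = np.abs(np.asarray(t, dtype=float))
    flat = t >= mcp_alpha * mcp_lambda
    return np.where(
        flat,
        0.5 * mcp_alpha * mcp_lambda**2,          # constant region
        mcp_lambda * t - t**2 / (2 * mcp_alpha),  # concave region
    )
```

Larger `mcp_alpha` makes the penalty closer to the lasso; smaller values flatten it (and hence stop penalizing large coefficients) sooner, which is why a sensible `auto` default is convenient.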
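
The `include_bias` behavior is the usual equivalence between centering and an unregularized intercept: subtracting each group's means from the covariates and outcomes before fitting gives the same regularized coefficients as adding a penalty-free bias column. A minimal sketch of the centering step (hypothetical helper, not the module's code):

```python
import numpy as np

def center_group(X, y):
    """Center one group's covariates and outcomes.

    Fitting on the centered data is equivalent to including an
    unregularized bias; the intercept can be recovered afterwards as
    y_mean - x_mean @ coef.
    """
    x_mean = X.mean(axis=0)
    y_mean = y.mean()
    return X - x_mean, y - y_mean, x_mean, y_mean
```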
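
Finally, the `proportional_lambda` semantics can be read as: the penalty actually used is `mcp_lambda` times an analytic reference value. The reference rate below, `sqrt(log(d) / n)`, is the usual order of the penalty in sparse-regression theory and is assumed here only for illustration; the module's analytic expression may differ:

```python
import numpy as np

def effective_mcp_lambda(mcp_lambda, n_samples, n_features,
                         proportional_lambda=True):
    """Resolve the penalty level actually used in the fit.

    With proportional_lambda=True, mcp_lambda scales an analytic
    reference value (an assumed sqrt(log(d)/n) rate here); otherwise
    mcp_lambda is used as the raw penalty.
    """
    if not proportional_lambda:
        return mcp_lambda
    reference = np.sqrt(np.log(n_features) / n_samples)
    return mcp_lambda * reference
```

With this convention, tuning `mcp_lambda` over a small grid around 1.0 scans multiples of the analytic value instead of raw penalty magnitudes.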
edit parameter description
CLAassistant commented May 31, 2023

CLA assistant check
All committers have signed the CLA.
