
105 parameter tuning #131

Open · wants to merge 26 commits into base: dev
Conversation

ChloeYou (Contributor) commented:

This is a demo vignette showing how I approached cross-validation and hyperparameter tuning, so it doesn't need to be merged into the main branch. The current epipredict workflow doesn't seem to directly support CV and parameter tuning; the good news is that both can still be achieved with a few extra lines of code. The biggest issue comes from the recipes step: the model doesn't recognize the new variables created in recipes, which causes problems with subsequent steps in CV. The workaround is to prep and bake the data before passing it into the workflow for CV and tuning.
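The prep/bake workaround above could be sketched roughly as follows. This is a minimal illustration using standard tidymodels functions, not code from the PR itself; the names `train_data`, `rec`, `spec`, and `outcome` are hypothetical placeholders.

```r
library(recipes)
library(rsample)
library(workflows)
library(tune)

# Assumed inputs (illustrative, not from the PR):
#   train_data - a training data frame / epi_df
#   rec        - a recipe with feature-engineering steps
#   spec       - a parsnip model spec with tunable parameters

# Materialize the engineered features up front, so the columns
# created by the recipe exist before resampling begins.
baked <- bake(prep(rec, training = train_data), new_data = NULL)

# Resample the pre-baked data and tune over it. The workflow now
# only needs a plain formula, since the features already exist.
folds <- vfold_cv(baked, v = 5)

wf <- workflow() |>
  add_model(spec) |>
  add_formula(outcome ~ .)

res <- tune_grid(wf, resamples = folds, grid = 10)
```

This sidesteps the issue of recipe-created variables not being visible to later CV steps, at the cost of preprocessing once outside the resampling loop rather than within each fold.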

@dajmcdon Would you be able to take a look when you're available? Let me know if you want to discuss more on it. Thank you!

@ChloeYou ChloeYou linked an issue Aug 30, 2022 that may be closed by this pull request
@ChloeYou ChloeYou marked this pull request as ready for review August 30, 2022 04:32
dajmcdon (Contributor) left a comment:

test comment

@dajmcdon dajmcdon changed the base branch from frosting to main October 27, 2022 19:10
Successfully merging this pull request may close these issues.

parsnip and glmnet for model parameter tuning
3 participants