
Multi-task hyperparameter tuning using Bayesian optimization #523

Open
Huma-Shehwana opened this issue Feb 7, 2024 · 0 comments

Comments

@Huma-Shehwana

Hello, I am attempting to perform multi-task hyperparameter tuning using Bayesian optimization for the XGBoost algorithm. I believe my scenario qualifies as a multi-task problem: my dataset comprises four variables, and I aim to treat each variable in turn as the target while using the other three as features. Consequently, I plan to train the XGBoost model four times, each time with a different target variable. My objective is to find a single set of hyperparameters that optimizes performance across all four tasks simultaneously. I understand that this differs from multi-objective hyperparameter tuning, where several performance metrics are evaluated for one model. Is multi-task hyperparameter tuning feasible with the mlrMBO package? If so, could you please point me to any related tutorials or resources?
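One common way to frame this, regardless of tuner: collapse the four per-task losses into a single scalar objective (e.g. their mean) and hand that scalar function to any single-objective optimizer. The sketch below illustrates only that aggregation idea; it is a minimal stand-in, not mlrMBO usage. The quadratic `task_loss`, the assumed per-task optima, and the random-search loop (in place of a real Bayesian optimizer) are all hypothetical placeholders for training XGBoost four times and returning cross-validated error.

```python
# Hypothetical sketch: tune one hyperparameter across four tasks by
# aggregating per-task losses into a single scalar objective.
# A Bayesian optimizer (e.g. mlrMBO's mbo() in R) would replace the
# random-search loop; the quadratic losses stand in for real XGBoost
# cross-validation error on each of the four target variables.
import random
import statistics

random.seed(0)

# Assumed per-task best learning rates (placeholders, one per task).
TASK_OPTIMA = [0.05, 0.10, 0.15, 0.20]

def task_loss(learning_rate, optimum):
    """Stand-in for one task's cross-validated error."""
    return (learning_rate - optimum) ** 2

def multi_task_objective(learning_rate):
    """Scalar objective: mean loss over all four tasks."""
    return statistics.mean(task_loss(learning_rate, o) for o in TASK_OPTIMA)

# Random search as a placeholder for a Bayesian optimizer.
candidates = [random.uniform(0.01, 0.3) for _ in range(200)]
best = min(candidates, key=multi_task_objective)
print(best)
```

Because the objective averages the four tasks, the chosen value compromises between the per-task optima rather than favoring any single task; a weighted mean or the worst-case (max) loss are alternative aggregations with different trade-offs.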
