Support for quantile regression as stated in the parameters section of the docs #36

Open
aaroncueckermann opened this issue Aug 17, 2021 · 1 comment

@aaroncueckermann

~/miniconda3/lib/python3.8/site-packages/gpboost/engine.py in train(params, train_set, num_boost_round, gp_model, use_gp_model_for_validation, train_gp_model_cov_pars, valid_sets, valid_names, fobj, feval, init_model, feature_name, categorical_feature, early_stopping_rounds, evals_result, verbose_eval, learning_rates, keep_training_booster, callbacks)
276 # construct booster
277 try:
--> 278 booster = Booster(params=params, train_set=train_set, gp_model=gp_model)
279 if is_valid_contain_train:
280 booster.set_train_data_name(train_data_name)

~/miniconda3/lib/python3.8/site-packages/gpboost/basic.py in __init__(self, params, train_set, model_file, model_str, silent, gp_model)
2383 self.has_gp_model = True
2384 self.gp_model = gp_model
--> 2385 _safe_call(_LIB.LGBM_GPBoosterCreate(
2386 train_set.construct().handle,
2387 c_str(params_str),

~/miniconda3/lib/python3.8/site-packages/gpboost/basic.py in _safe_call(ret)
113 """
114 if ret != 0:
--> 115 raise GPBoostError(_LIB.LGBM_GetLastError().decode('utf-8'))
116
117

GPBoostError: The GPBoost algorithm can currently not be used for objective = quantile. If this is desired, contact the developer or open a GitHub issue.

It would be fantastic if quantile regression support could be added.
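For context, the call that produced this traceback is not shown above; a minimal setup that would presumably trigger the same error looks something like the following (a hypothetical sketch: the simulated data, the grouped random-effects GPModel, and the alpha parameter are illustrative assumptions, not taken from the original report):

import numpy as np
import gpboost as gpb

# simulated data with a grouping variable for a random-effects model (illustrative)
n = 500
X = np.random.rand(n, 2)
group = np.random.randint(0, 50, size=n)
y = X[:, 0] + np.random.randn(n)

gp_model = gpb.GPModel(group_data=group)          # grouped random effects
train_set = gpb.Dataset(X, y)
params = {"objective": "quantile", "alpha": 0.9}  # quantile objective

# constructing the Booster with a gp_model and objective = quantile raises GPBoostError
bst = gpb.train(params=params, train_set=train_set, gp_model=gp_model,
                num_boost_round=50)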

@fabsig (Owner) commented Aug 26, 2021

Thank you for this suggestion.

The current implementation for non-Gaussian likelihoods (conditional data distributions) relies on the Laplace approximation, which requires three derivatives of the log-likelihood with respect to the parameter of interest (here, the location parameter). For quantile regression, the corresponding log-likelihood (the log asymmetric Laplace density) is not differentiable at one point, and its second and third derivatives are zero everywhere except at that point. That is, it seems that the Laplace approximation cannot be used. It is currently unclear to me what could be done here and with how much effort. Maybe something can be achieved by approximating the quantile loss with a smooth surrogate loss, using ideas as in this article.
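In symbols, a sketch of why this breaks down (notation introduced here for illustration, with location parameter \mu, scale \sigma, and quantile level \tau; not taken from the GPBoost code):

$$\log p(y \mid \mu) = \log\frac{\tau(1-\tau)}{\sigma} - \rho_\tau\!\left(\frac{y-\mu}{\sigma}\right), \qquad \rho_\tau(u) = u\,\big(\tau - \mathbf{1}\{u<0\}\big),$$

$$\frac{\partial}{\partial\mu}\log p(y \mid \mu) = \frac{\tau - \mathbf{1}\{y<\mu\}}{\sigma} \quad (\mu \neq y), \qquad \frac{\partial^2}{\partial\mu^2}\log p(y \mid \mu) = \frac{\partial^3}{\partial\mu^3}\log p(y \mid \mu) = 0 \quad (\mu \neq y).$$

The first derivative is piecewise constant in \mu with a jump at \mu = y, so the curvature that a Laplace approximation relies on is zero almost everywhere; replacing \rho_\tau by a smooth surrogate would restore nonzero second derivatives, which is what the surrogate-loss idea amounts to.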

fabsig added the enhancement label Aug 26, 2021