
predict after cross-validation using xgboost [question] #92

Closed
RanaivosonHerimanitra opened this issue Nov 1, 2014 · 3 comments
@RanaivosonHerimanitra

This is my first trial with xgboost (very fast!), but I'm a little bit confused.
In fact, I trained a model using xgb.cv as follows:
xgbmodel=xgb.cv(params=param, data=trainingdata, nrounds=100, nfold=5, showsd=T, metrics='logloss')
Now I want to predict on my test set, but xgbmodel seems to be a logical value (TRUE in this case).
How can I predict after cv? Should I use xgb.train instead?
HR

@tqchen
Member

tqchen commented Nov 1, 2014

Yes, xgb.cv does not return the model, but the cv history of the process, since in cv we train n models to evaluate the result.

A normal use case of cv is parameter selection: use cv to find good parameters, then use xgb.train to train the model on the entire dataset.
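A minimal R sketch of that workflow, assuming the same trainingdata object from the question plus a hypothetical testdata set (parameter values are illustrative, not from the thread):

```r
library(xgboost)

# Step 1: cross-validate to evaluate a candidate parameter set.
# xgb.cv returns the evaluation history, not a fitted model.
param <- list(objective = "binary:logistic", max_depth = 4, eta = 0.1)
cv <- xgb.cv(params = param, data = trainingdata, nrounds = 100,
             nfold = 5, showsd = TRUE, metrics = "logloss")

# Step 2: once the cv history has guided the choice of parameters
# (and number of rounds), fit a single model on the full training
# data with xgb.train, and predict with that model.
model <- xgb.train(params = param, data = trainingdata, nrounds = 100)
pred  <- predict(model, testdata)
```

The point of the two-step split is that cv trains n disjoint fold models purely to estimate generalization error; none of them sees the whole training set, so a final xgb.train fit is still needed for prediction.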

@RanaivosonHerimanitra
Author

OK, it's clearer now.

@vikasnitk85

Hi,

There is a parameter prediction=TRUE in xgb.cv, which returns the predictions of the cv folds. But it is not clear from the documentation for which nround the predictions are returned.
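For reference, the option in question is used like this (a sketch only; the exact shape of the return value, and which boosting round the predictions correspond to, may vary across xgboost versions and is precisely what is being asked above):

```r
library(xgboost)

# With prediction = TRUE, xgb.cv also returns out-of-fold predictions
# alongside the evaluation history.
cv <- xgb.cv(params = param, data = trainingdata, nrounds = 100,
             nfold = 5, metrics = "logloss", prediction = TRUE)

# cv$pred holds one out-of-fold prediction per training row, i.e. each
# row is predicted by the fold model that did not train on it.
head(cv$pred)
```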

hcho3 pushed a commit to hcho3/xgboost that referenced this issue May 9, 2018
@lock lock bot locked as resolved and limited conversation to collaborators Oct 26, 2018