predict after cross-validation using xgboost [question] #92
Comments
Yes, xgb.cv does not return the model, but the CV history of the process, since in cross-validation we train n models to evaluate the result. A typical use case of CV is parameter selection: you use xgb.cv to find good parameters, and then use xgb.train to train the model on the entire dataset.
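A minimal sketch of that workflow in R. The data and parameter values here are only illustrative (it uses the agaricus demo data shipped with the xgboost package; max_depth and eta are arbitrary choices, not recommendations):

```r
library(xgboost)

# Demo data bundled with the xgboost package (illustrative only)
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
dtest  <- xgb.DMatrix(agaricus.test$data,  label = agaricus.test$label)

param <- list(objective = "binary:logistic", max_depth = 2, eta = 0.3)

# 1. Use cross-validation only to evaluate the parameters / choose nrounds;
#    this returns the CV evaluation history, not a model
cv <- xgb.cv(params = param, data = dtrain, nrounds = 100, nfold = 5,
             metrics = "logloss", showsd = TRUE)

# 2. Re-train on the full training set with the settings chosen via CV
model <- xgb.train(params = param, data = dtrain, nrounds = 100)

# 3. Predict with the actual model object
pred <- predict(model, dtest)
```

The key point is that only the object returned by xgb.train (or xgboost()) can be passed to predict(); the xgb.cv result cannot.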
OK, it's clearer now.
Hi, there is a parameter prediction=TRUE in xgb.cv, which returns the predictions for the CV folds. But the documentation does not make clear for which nround the predictions are returned.
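For reference, a hedged sketch of using that parameter (assuming an R package version where the xgb.cv result exposes the out-of-fold predictions as a pred element; which round those predictions correspond to is exactly the ambiguity raised above). The data and parameter values are illustrative only:

```r
library(xgboost)

# Demo data bundled with the xgboost package (illustrative only)
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

param <- list(objective = "binary:logistic", max_depth = 2, eta = 0.3)

# prediction = TRUE asks xgb.cv to also return out-of-fold predictions
cv <- xgb.cv(params = param, data = dtrain, nrounds = 10, nfold = 5,
             metrics = "logloss", prediction = TRUE)

# One out-of-fold prediction per training row (assumed accessor)
oof_pred <- cv$pred
```

These are predictions on held-out folds, useful for estimating generalization or for stacking, but they still do not give you a reusable model object.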
This is my first trial with xgboost (very fast!), but I'm a little bit confused.
In fact, I trained a model using xgb.cv as follows:
xgbmodel = xgb.cv(params = param, data = trainingdata, nrounds = 100, nfold = 5, showsd = TRUE, metrics = "logloss")
Now I want to predict on my test set, but xgbmodel seems to be a logical value (TRUE in this case).
How could I predict after cv? Should I use xgb.train then?
HR