Explaining XGBoost - Explain Predictions vs xgb.importance
sherlock
I need to interpret a complex XGBoost model. I have it in RapidMiner as well as in R using the xgb package. xgb has a built-in function for feature importance; however, its results differ from those of the RapidMiner operator "Explain Predictions". Is there a detailed explanation of how "Explain Predictions" works? Which one would you recommend?
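Part of the discrepancy may simply be that "importance" has several definitions: tree-ensemble importances can be computed by total gain, by split frequency (cover or weight), or by local per-prediction attribution, and these can rank features differently on the same model. The toy sketch below (plain Python, not the xgboost or RapidMiner API; the feature names and numbers are made up for illustration) shows how gain-based and frequency-based importance can disagree:

```python
# Toy illustration (NOT the actual xgboost API): different importance
# definitions can rank features differently on the same ensemble.
# Each "tree" is a list of split nodes: (feature_name, gain_of_split).
from collections import defaultdict

trees = [
    [("age", 10.0), ("income", 1.0), ("income", 1.0)],
    [("income", 1.5), ("income", 1.5), ("age", 0.5)],
]

def importance(trees, measure):
    scores = defaultdict(float)
    for tree in trees:
        for feature, gain in tree:
            # "gain": sum the loss reduction of every split on the feature.
            # "frequency": just count how often the feature is split on.
            scores[feature] += gain if measure == "gain" else 1.0
    total = sum(scores.values())
    return {f: s / total for f, s in scores.items()}  # normalise to 1

gain = importance(trees, "gain")
freq = importance(trees, "frequency")
top_by_gain = max(gain, key=gain.get)   # "age" dominates by gain
top_by_freq = max(freq, key=freq.get)   # "income" dominates by frequency
```

xgb.importance in R reports global, model-level scores of this kind, while a per-prediction explanation operator attributes an individual prediction locally, so the two need not agree even when both are computed correctly.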
Tagged: AI Studio
Accepted answers
There are no accepted answers yet