Explaining XGBoost - Explain Predictions vs xgb.importance

sherlock New Altair Community Member
edited November 2024 in Community Q&A
I need to interpret a complex XGBoost model. I have it in RapidMiner as well as in R using the xgb package. xgb has a built-in function for feature importance, but its results differ from those of the RapidMiner operator "Explain Predictions". Is there a detailed explanation of how "Explain Predictions" works? Which one would you recommend?
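For reference, this is roughly what I run in R to get the built-in importance (a minimal sketch on toy data; the dataset and model settings are placeholders, not my actual setup):

```r
library(xgboost)

# Toy data purely for illustration (not my real dataset)
X <- as.matrix(mtcars[, -1])
y <- mtcars$mpg

# Small model just to show the call
bst <- xgboost(data = X, label = y, nrounds = 20,
               objective = "reg:squarederror", verbose = 0)

# Built-in global importance: Gain / Cover / Frequency per feature
imp <- xgb.importance(model = bst)
print(imp)
```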