Community Q&A
Explaining XGBoost - Explain Predictions vs xgb.importance
sherlock
I need to interpret a complex XGBoost model. I have it in RapidMiner as well as in R using the xgb package. xgb has a built-in function for importance; however, its feature importance differs from the results of the RapidMiner operator "Explain Predictions". Is there a detailed explanation of how "Explain Predictions" works? Which one would you recommend?
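A likely source of the discrepancy (stated here as an assumption, since the thread does not confirm how either tool computes its numbers): xgb.importance reports a *global* ranking aggregated over the whole model, while an operator named "Explain Predictions" plausibly produces *local*, per-example attributions. Those two views need not agree. The toy sketch below, using a hypothetical additive model rather than either tool's actual algorithm, shows how a feature that dominates globally can be irrelevant for one particular prediction:

```python
# Toy illustration (NOT RapidMiner's or xgboost's actual algorithm):
# for an additive model, per-feature contributions to one prediction
# can rank features differently than importance aggregated globally.

# Hypothetical dataset of (feature_A, feature_B) values.
data = [
    (2.0, 0.1),
    (1.5, -0.2),
    (2.5, 0.0),
    (0.1, 9.0),   # one instance where feature B dominates
]

def contributions(a, b):
    # Hypothetical additive model: prediction = 1.0*a + 0.5*b,
    # so each feature's contribution is its term in the sum.
    return {"A": 1.0 * a, "B": 0.5 * b}

# "Global importance": mean absolute contribution over the dataset.
n = len(data)
global_imp = {
    feat: sum(abs(contributions(a, b)[feat]) for a, b in data) / n
    for feat in ("A", "B")
}
print(global_imp)   # feature A outweighs B on average

# "Local explanation" for the last instance: here B dominates,
# even though A ranks higher globally.
local = contributions(*data[-1])
print(local)
```

Both rankings are "correct"; they just answer different questions (which features matter to the model overall vs. which features drove this one prediction), which is one hedged explanation for why the two tools disagree.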
Tagged: AI Studio
Accepted answers
There are no accepted answers yet