Logic behind importance ranking in Gradient Boosted Tree (GBT)
Hi,
Could you please explain the basis for ranking attribute importance in GBT? For example, is it based on information gain, or does it use a backward elimination / forward selection approach like the one the SelectAttribute operator uses? I would appreciate your answers, and even more so an article or webpage (ideally from the RapidMiner documentation) that explains the mathematical logic behind ranking attribute importance in Gradient Boosted Trees (GBT).
Thanks,
Best Answer
-
Hi,
this sounds like a deep dive. Have a look at my favourite ML book, Hastie et al.: https://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLII_print12.pdf
Page 367.
It seems to be just the average of the feature importances of the individual trees.
Best,
Martin
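(For readers who want the formula: as I read the section Martin points to, this is the textbook version from Hastie et al., not RapidMiner documentation. The relative importance of a variable in a single tree sums the squared improvements in the splitting criterion over all internal nodes that split on that variable, and for the boosted ensemble these per-tree importances are simply averaged:

$$\mathcal{I}_\ell^2(T) = \sum_{t=1}^{J-1} \hat{\imath}_t^2 \, I\big(v(t) = \ell\big)$$

$$\mathcal{I}_\ell^2 = \frac{1}{M} \sum_{m=1}^{M} \mathcal{I}_\ell^2(T_m)$$

where the first sum runs over the $J-1$ internal nodes of a tree with $J$ terminal nodes, $\hat{\imath}_t^2$ is the estimated improvement from the split at node $t$, $v(t)$ is the variable used for that split, and $M$ is the number of trees. Whether RapidMiner's GBT implementation computes importance exactly this way is not stated in this thread, so treat this as the textbook formulation rather than the operator's documented behaviour.)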
Answers
-
Thanks Martin, I just checked the reference. You made my life easier.