
Gradient Boosted Tree and performance

User: "Barborka"
New Altair Community Member
Updated by Jocelyn
Dear community,

I want to understand my GBT algorithm. I trained it and validated it on new data with quite a good result. Now I would like to understand the model and find out which attributes were the most decisive ones, but here I fail. For example, my Tree 1 is described as

ch1 in {1009351207,1047831207,... (46 more)}: 0.013 {}

ch1 not in {1009351207,1047831207,... (46 more)}

|   ch1 in {1009351207,1000751092,... (49 more)}: -0.009 {}

|   ch1 not in {1009351207,1000751092,... (49 more)}: -0.027 {}


Could you please explain where I can find these 46 more attributes? Or 49 more attributes?


Thanks a lot.


    User: "BalazsBaranyRM"
    New Altair Community Member
    Accepted Answer
    Hi @Barborka,

    If you're looking at the description of one tree and it only contains ch1, then that tree only considers ch1. Other trees might consider different attributes. The weights output of the entire model shows the summary; single trees are not that relevant.
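
    The same idea can be sketched outside RapidMiner. Below is a minimal scikit-learn example (an assumption for illustration, not the RapidMiner implementation): each individual tree in a gradient-boosted ensemble splits on only a few features, while the model-level importance aggregates over all trees, analogous to the weights output mentioned above. The dataset and parameter choices here are arbitrary.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    # Synthetic data with 5 features, 3 of them informative
    X, y = make_classification(n_samples=200, n_features=5,
                               n_informative=3, random_state=0)
    model = GradientBoostingClassifier(n_estimators=50, max_depth=2,
                                       random_state=0).fit(X, y)

    # Aggregate importance over the whole ensemble (the summary view,
    # analogous to the model's weights output in RapidMiner)
    for i, imp in enumerate(model.feature_importances_):
        print(f"feature_{i}: {imp:.3f}")

    # A single tree typically splits on only a subset of the features
    first_tree = model.estimators_[0, 0]
    used = {f for f in first_tree.tree_.feature if f >= 0}
    print("features split on by tree 1:", sorted(used))
    ```

    Inspecting several entries of `model.estimators_` shows different trees using different feature subsets, which is why reading one tree in isolation says little about the model as a whole.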

    I couldn't find a way to extract the whole list of values going into the rules. There are some promising operators like Tree to Rules and DecisionTree to ExampleSet (in the Converters extension), but these only work with single trees, not with GBT models.

    Regards,
    Balázs