
Gradient Boosted Trees don't give the final prediction

User: "sbrnae"
New Altair Community Member
Updated by Jocelyn
Hello RapidMiner Community!

I have a question about the Gradient Boosted Trees model I used in my study on predicting corporate default risk. My dependent variable is default vs. non-default, coded as 1 for default and 0 for non-default, and I have already set the data type to binominal. After applying the related operators (Select Attributes, Set Role, and Cross Validation), none of the trees in the result show branches ending in the 1 or 0 labels I assigned. Below I share one of the Gradient Boosted models.


So my question is: why does this happen, and is there any way to solve it? I really hope someone can help me, because this is important for my study and the deadline is very near. I'm open to answers from anyone. Thank you in advance.
    Hi,
    I think you need to check how a GBT works to understand this. The leaves here are simply not the label you are looking for.

    In each boosting iteration, the new tree tries to predict the residual:

    real_value - sum(previous_predictions)

    where real_value is 0 or 1. So if the first tree predicts 0.7 where the real value is 1, the second tree tries to predict 0.3, and so on.*

    Please check this guide: https://community.rapidminer.com/discussion/51258/a-practical-guide-to-gradient-boosted-trees-part-i-regression . It should help (even though it's about regression).

    Best,
    Martin

    *: The reality is a bit different, because it depends on the chosen loss measure. Also note that there is effectively a 0-th run, where you just predict the average of the classes. Thus you do not start from 0/1.
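
    The iteration Martin describes can be sketched in a few lines of plain Python. This is a toy illustration, not RapidMiner's actual implementation: the "trees" are one-split stumps, the data is made up, and squared-error residuals stand in for the loss-specific gradients a real GBT uses.

    ```python
    # Toy gradient boosting on a 0/1 label: each tree fits the residuals
    # of the previous predictions, so its leaves hold corrections, not 0/1.

    def stump_fit(x, y, threshold):
        """Fit a one-split 'tree': mean of y on each side of the threshold."""
        left = [yi for xi, yi in zip(x, y) if xi <= threshold]
        right = [yi for xi, yi in zip(x, y) if xi > threshold]
        lmean = sum(left) / len(left) if left else 0.0
        rmean = sum(right) / len(right) if right else 0.0
        return lambda xi: lmean if xi <= threshold else rmean

    x = [1.0, 2.0, 3.0, 4.0]   # single made-up feature
    y = [0.0, 0.0, 1.0, 1.0]   # binominal label coded as 0/1

    # "0-th run": start from the average of the classes, not from 0/1
    pred = [sum(y) / len(y)] * len(x)

    for _ in range(3):  # a few boosting iterations
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        tree = stump_fit(x, residuals, threshold=2.5)
        pred = [pi + tree(xi) for pi, xi in zip(pred, x)]

    print(pred)  # sums of leaf corrections converge toward the 0/1 labels
    ```

    The first stump's leaves here are -0.5 and +0.5 (the residuals after predicting the class average), which is exactly why the leaf values you see in the model output are not the 1/0 labels you assigned.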