
Stacking: probabilities instead of label?

User: "spitfire_ch"
New Altair Community Member
Updated by Jocelyn
Hi,

I was experimenting with stacking and noticed that the only output the base learners provide to the "stacking model learner" is the final label. Don't most base learners contain more information than just the final label? More specifically, couldn't one pass probabilities instead of the final label?

E.g., if the leaf of choice in a decision tree contains 3 positive and 2 negative cases, pass 3/5 = 0.6 instead of the label P. That way, each "guess" by the base models would automatically be weighted. If model A is sure about its result while model B is not (and predicts a different outcome), then the prediction of model A would be favored.
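To make the idea concrete, here is a minimal sketch (not RapidMiner-specific; the function names and numbers are made up for illustration). It compares a hard majority vote over base-learner labels with a "soft vote" that averages the probabilities, so a very confident model can outweigh several barely-decided ones:

```python
def hard_vote(probs, threshold=0.5):
    """Majority vote over hard labels derived from each probability."""
    votes = [p >= threshold for p in probs]
    return "P" if sum(votes) > len(votes) / 2 else "N"

def soft_vote(probs, threshold=0.5):
    """Average the probabilities first, then threshold once."""
    avg = sum(probs) / len(probs)
    return "P" if avg >= threshold else "N"

# A decision-tree leaf holding 3 positive and 2 negative training cases
# would emit 3/5 = 0.6 rather than the bare label "P".
leaf_prob = 3 / 5

# One very confident base model vs. two barely-negative ones:
probs = [0.95, 0.45, 0.48]
print(hard_vote(probs))  # "N" - the confident model is outvoted
print(soft_vote(probs))  # "P" - its certainty tips the average
```

In a full stacking setup the meta-learner would receive these probabilities as input features and learn the weighting itself, rather than using a fixed average as above.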

Best regards
Hanspeter
