stacking operator

Thiru
New Altair Community Member
Dear all, for my classification problem (supervised learning), I'm trying to use the Stacking operator.
The tutorial for this operator shows two layers, i.e. one base learner layer + a stacking learner.
My query is: can we have more than two layers?
For example, is it possible to have 3 layers, i.e. two layers of base learners, before feeding into the stacking learner?
If so, how do I configure that? Kindly let me know.
Thanks,
thiru
Best Answer
Can't you just put one Stacking operator inside another? So the "base learner" of the outer stack would be another Stacking operator of its own?
Or, in theory, you could use any other ensemble operator as well. Basically anything that gives you a model output, which is all the Stacking operator requires as an input on the left side.
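As a rough illustration of the nesting idea (outside RapidMiner), here is a minimal sketch using scikit-learn's StackingClassifier as an analogue to the Stacking operator. The learner choices and parameters are purely illustrative assumptions, not a recommended configuration:

```python
# Sketch: one stacking model used as a base learner inside an outer stacking model.
# scikit-learn stands in for the RapidMiner Stacking operator here.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=42)

# Inner stack: its own base learners plus a meta (stacking) learner.
inner_stack = StackingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)

# Outer stack: the inner stack is treated as just another base learner,
# next to a gradient-boosted model.
outer_stack = StackingClassifier(
    estimators=[
        ("inner_stack", inner_stack),
        ("gbt", GradientBoostingClassifier(random_state=42)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)

print(cross_val_score(outer_stack, X, y, cv=5).mean())
```

The point is only that the inner stack exposes the same "model" interface as any single learner, so the outer stacking learner cannot tell the difference; the same logic applies when you nest one Stacking operator inside another in a RapidMiner process.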
Answers
Thanks for your reply. Yes, it works.
1. Is there any guideline on how to combine the operators?
Using gradient boosting alone gives an accuracy of 75.75%. I want to take it beyond 95% using stacking
(of course, I'll also look into precision/recall).
I tried many stacking combinations, but I couldn't get beyond 73%.
2. Is there any theoretical/white-paper reference, RapidMiner resource, or book on how to choose the combination of base learners and stacking learner for maximum classification performance?
3. Would it be useful to use non-neural-net operators as base learners and a neural net (deep learning) as the stacking learner? (See the rough sketch below.)
I tried this combination; in my case, it reduces the performance to 55%.
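For item 3, this is roughly the kind of combination I mean, sketched with scikit-learn as a stand-in for the RapidMiner operators (the specific learners and parameters are only illustrative assumptions, not my actual process):

```python
# Sketch: non-neural-net base learners with a neural net as the stacking (meta) learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gbt", GradientBoostingClassifier(random_state=0)),
    ],
    # Neural net as the stacking learner, trained on the base learners' predictions.
    final_estimator=MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)

print(cross_val_score(stack, X, y, cv=5).mean())
```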
Thank you,
thiru