[SOLVED] Creating ensemble methods (bagging)
laplanto
New Altair Community Member
Hi, I am trying to create a hybrid classifier using two different classification algorithms. I want to use bagging, so the idea is to split the dataset into multiple datasets, classify them with one kind of classifier, and then classify the results with another classifier.
I have a dataset. I use the Bagging operator in RapidMiner with k-NN inside it, and I manage to get classification results from each k-NN. How can I collect those results and feed them as a dataset to the next classifier (probably a random forest)?
Answers
Hi,
if I understand you correctly, what you want to do is commonly referred to as "stacking". We have a corresponding Stacking operator in RapidMiner. Please have a look at its documentation to see if it fits your needs.
Best regards,
Marius
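In case it helps to see the idea outside RapidMiner, here is a minimal sketch of stacking in scikit-learn, purely as an illustration of what the Stacking operator does conceptually (the dataset, the choice of three k-NN base learners, and the random forest second level are assumptions taken from the question, not part of any actual process):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

# First level ("base learners"): a few k-NN models with different k values.
base_learners = [(f"knn_{k}", KNeighborsClassifier(n_neighbors=k)) for k in (3, 5, 7)]

# Second level: a learner trained on the base learners' predictions.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=RandomForestClassifier(random_state=0))

print(cross_val_score(stack, X, y, cv=5).mean())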
If I understand it correctly, stacking is exactly what I need. But I have problems using it.
First of all, I need to test this hybrid classifier with 10, 100, and 200 k-NNs in the first level, and I don't know how to do it. I think I need to use the Split and Bagging operators.
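Just to illustrate the shape of what I mean (a hypothetical scikit-learn sketch of the idea, not a RapidMiner process), varying the number of bagged k-NN models would be something like:

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

# Bagging with k-NN inside it; the number of bagged models is a parameter.
for n in (10, 100, 200):
    bagged_knn = BaggingClassifier(KNeighborsClassifier(), n_estimators=n)
    print(n, cross_val_score(bagged_knn, X, y, cv=3).mean())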
Secondly, I need to use an SVM in the second level, but I get the error "SVM does not have sufficient capabilities for the given data: binomial attributes not supported".
OK, I managed to do this:
The only problem is that it does not work with SVM. I get the error "The operator SVM does not have sufficient capabilities for the given data set: binomial attributes not supported".
If I replace the SVM with k-NN, it works perfectly. But I need the SVM to be there.
**********************************************************************************
EDIT
I found a workaround for my problem. I swapped the places of k-NN and SVM and it works OK. Of course, for future work, it would be nice to know how to solve this problem properly.
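For illustration only (a scikit-learn sketch of the idea, not the actual RapidMiner process), the swapped arrangement looks roughly like this: SVMs as base learners and k-NN as the second-level learner.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Swapped arrangement: SVMs in the first level, k-NN in the second level.
stack = StackingClassifier(
    estimators=[("svm_rbf", SVC()), ("svm_linear", SVC(kernel="linear"))],
    final_estimator=KNeighborsClassifier(n_neighbors=5),
)
print(cross_val_score(stack, X, y, cv=3).mean())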
This may be an old thread, but in case anyone else stumbles upon it like I did, I thought I should answer what I can.
The k-NN model creates a new binomial prediction attribute that is appended to your dataset. SVM cannot handle binomial attributes, and that is why it throws that error. The algorithms placed in the Base Learner window (left side) of the Stacking operator will always create this new binomial attribute, so the algorithm in the Stacking Model Learner window (right side) has to support binomial attributes.
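A minimal sketch of the same issue, using scikit-learn purely as an illustration (the dataset, the "yes"/"no" labels, and the choice of k values are assumptions): the base learners' class-label predictions play the role of the binomial prediction attributes, and they have to be converted to numbers before an SVM can be trained on them, which is roughly what a nominal-to-numerical conversion step would do in RapidMiner.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import OneHotEncoder
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# First level: k-NN predictions, turned into string labels to mimic the
# binomial prediction attributes that the base learners append.
preds = np.column_stack([
    np.where(KNeighborsClassifier(n_neighbors=k)
             .fit(X_train, y_train)
             .predict(X_train) == 1, "yes", "no")
    for k in (3, 5, 7)
])

# An SVM cannot be trained on nominal attributes directly, so the labels
# are one-hot encoded first before fitting the second-level SVM.
encoder = OneHotEncoder().fit(preds)
svm = SVC().fit(encoder.transform(preds), y_train)
print(svm.score(encoder.transform(preds), y_train))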
Hi KellyM: Thanks so much for your post!
Best wishes, Michael Martin