"Decision Tree Optimization and Accuracy"
Hi,
I'm using a Decision Tree that is nested inside an Optimize Parameters (Grid) operator, optimizing Max. Depth and Min. Gain.
When I rebuild the decision tree with the optimal parameter values found above, but without the Optimize Parameters (Grid) operator, the accuracy is lower.
Why is that so?
Best Answer
If you are using cross-validation without a local random seed set, then every time you close and re-open that process in RapidMiner you can get a different result, even if you don't make any other changes. So I suspect that could be the issue.
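(For anyone who wants to see the effect outside RapidMiner: below is a minimal scikit-learn sketch, assuming the Iris data and an arbitrary example depth, that mimics unseeded vs. seeded cross-validation. The unseeded runs can report slightly different accuracies each time, while fixing the seed makes them repeatable.)

```python
# Minimal sketch (scikit-learn analogy, not RapidMiner itself):
# cross-validation with shuffling but no fixed seed can score
# differently on every run; fixing the seed makes it repeatable.
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=4)  # example depth, not the tuned value

# No random_state: fold assignment changes between runs -> accuracy can drift.
unseeded = KFold(n_splits=10, shuffle=True)
print("unseeded:", cross_val_score(tree, X, y, cv=unseeded).mean())

# Fixed random_state (any value works): identical folds, repeatable accuracy.
seeded = KFold(n_splits=10, shuffle=True, random_state=1992)
print("seeded:  ", cross_val_score(tree, X, y, cv=seeded).mean())
```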
Answers
Hi,
which performances are you comparing? The X-Val performance from inside the optimization with an X-Val result from outside it?
Are the performances comparable with respect to their standard deviations?
~Martin
Thanks for pointing me in the right direction.
The random seed was the issue; without it, the data sets were different after the split operator.
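(A quick way to see that effect, again sketched in scikit-learn rather than RapidMiner: two unseeded splits of the same data generally end up with different rows in the training partition, whereas a fixed seed reproduces the same split.)

```python
# Sketch: without a fixed seed, two splits of the same data differ,
# so a tree trained after the split sees different examples each run.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(20).reshape(-1, 1)
y = np.arange(20) % 2

X_a, _, _, _ = train_test_split(X, y, test_size=0.3)                  # unseeded
X_b, _, _, _ = train_test_split(X, y, test_size=0.3)                  # unseeded again
X_c, _, _, _ = train_test_split(X, y, test_size=0.3, random_state=7)  # seeded
X_d, _, _, _ = train_test_split(X, y, test_size=0.3, random_state=7)  # same seed

print("unseeded splits identical:", np.array_equal(X_a, X_b))  # usually False
print("seeded splits identical:  ", np.array_equal(X_c, X_d))  # True
```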