Different results inside and outside the Optimize Parameters operator
Dear RapidMiner community,
I have a problem using the Optimize Parameters operator.
I have a dataset (it doesn't really matter which one) and I want to optimize, say, a neural net's learning rate.
The problem I found is that the optimal parameter value determined by the optimizer gives different results when used outside the optimizer.
To give a concrete example: I use the Optimize Parameters operator to tune the learning rate of a neural net, with x-validation inside it.
Let's say the optimizer finds that a learning rate of 0.6 is optimal, and its performance vector reports an f-measure of 0.7.
When I then run an x-validation operator on the same data, with the same settings and the same local random seed, but without the optimizer, using the learning rate of 0.6 that the optimizer found, I get a different performance!
How can this be? Any suggestions? I even unchecked the shuffle option of the neural net.
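To illustrate what I suspect outside of RapidMiner: here is a small scikit-learn sketch (an analogy only; `MLPClassifier` and its parameters are scikit-learn names, not RapidMiner's). If the random seed is fixed only at the validation level (the splits), but not at the learner level, the neural net's random weight initialization can still differ between two otherwise identical runs, which would explain a discrepancy like mine. Pinning the learner's own seed makes the runs reproducible:

```python
import warnings
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, KFold
from sklearn.neural_network import MLPClassifier

warnings.simplefilter("ignore")  # silence convergence warnings for this sketch

X, y = make_classification(n_samples=200, random_state=0)
# The "local seed" of the x-validation: the splits are identical in every run.
cv = KFold(n_splits=5, shuffle=True, random_state=42)

# Learner seed NOT pinned: weight initialization draws from the global RNG,
# so two identical-looking cross-validations can report different scores.
unseeded = [
    cross_val_score(
        MLPClassifier(learning_rate_init=0.6, max_iter=50), X, y, cv=cv
    ).mean()
    for _ in range(2)
]

# Learner seed pinned as well: repeated runs are exactly reproducible.
seeded = [
    cross_val_score(
        MLPClassifier(learning_rate_init=0.6, max_iter=50, random_state=1),
        X, y, cv=cv,
    ).mean()
    for _ in range(2)
]
print(seeded[0] == seeded[1])  # True: same splits AND same weight init
```

So my question boils down to: does the optimizer advance some internal random state that my standalone x-validation run does not see, even though the x-validation's local seed is the same?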
Best regards,
Gabriel