Parameter Optimizer Problem
hagen85
New Altair Community Member
Hi There,
I am using an evolutionary parameter optimizer to determine the best parameters for my neural network. The neural network is embedded in a sliding-window X-Validation with cumulative learning. After the process is finished I get the parameter set and a performance of 67,33 %. When I actually apply the parameters I only get 65,85 %. How is that possible? Isn't that supposed to be the same?
Regards
Hagen
Answers
Hi,
it is hard to tell anything without knowing your process (please see http://rapid-i.com/rapidforum/index.php/topic,4654.0.html ). However, yes, it is possible that you get different results, since the experimental performance is always only an approximation of the true performance. So depending on your setup, it is perfectly possible that you see (usually small) differences in the performance.
Best, Marius
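To see why the optimizer's reported score tends to be a little higher than the score you get when re-applying the parameters, here is a small Python simulation (not RapidMiner code, just a toy model): every candidate parameter set is assumed to have the same true accuracy, but each cross-validation run returns a noisy estimate of it. Because the optimizer keeps the candidate with the best *estimate*, that reported number is optimistically biased, while a fresh evaluation with the chosen parameters is not.

```python
# Toy sketch of optimization selection bias (hypothetical, not RapidMiner code).
import random

random.seed(1)

TRUE_ACCURACY = 0.66   # assumed true performance of every candidate parameter set
NOISE = 0.02           # assumed spread of a single cross-validation estimate

def cv_estimate():
    """One noisy cross-validation estimate of the true accuracy."""
    return TRUE_ACCURACY + random.gauss(0, NOISE)

# The optimizer evaluates many parameter sets and reports the best estimate.
candidates = [cv_estimate() for _ in range(50)]
selected_score = max(candidates)

# Re-running the validation with the chosen parameters gives a fresh,
# unbiased estimate, which is typically a bit lower than the reported one.
fresh_score = cv_estimate()

print(f"score reported by optimizer: {selected_score:.4f}")
print(f"score when re-applied:       {fresh_score:.4f}")
```

Running this repeatedly shows the reported score sitting consistently above the true 66 %, while the fresh evaluation scatters around it, which matches the small gap between 67,33 % and 65,85 % described above.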