Different results from "Deep learning" in each run
mb223223
New Altair Community Member
Hi all
When I run "Deep learning" on a data set, I get a different "r" and "RMSE" of prediction in each separate run!
However, this is not the case with "NN".
I want to know why "Deep learning" gives different error estimates in each separate run.
Best Answer
-
Hello @mb223223
If you are asking about the Deep Learning operator (not the extension), then yes, the results change every time unless you select the reproducible option and set a local random seed in the parameters. The main reason is the random weight initialization in every run, together with the randomness that can enter the model during the training process. Even a simple neural net might change slightly if run on different machines; for reproducible results, set a local seed. Each compute node trains a copy of the global model parameters on its local data with multi-threading (asynchronously), and contributes periodically to the global model via model averaging across the network.
Hope this helps!
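To illustrate the weight-initialization point, here is a minimal sketch in plain NumPy (it is not RapidMiner or H2O code, and the function train_tiny_net and the toy data are purely hypothetical): a tiny regression net trained twice without a seed typically reports two different RMSE values, while fixing the seed makes the runs identical.

```python
# Minimal sketch: unseeded weight initialization -> different RMSE per run;
# a fixed seed -> reproducible RMSE. Pure NumPy, toy one-hidden-layer net.
import numpy as np

def train_tiny_net(X, y, seed=None, hidden=8, epochs=500, lr=0.05):
    """Train a one-hidden-layer regression net; return RMSE on the training data."""
    rng = np.random.default_rng(seed)          # seed=None -> different init each call
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)               # forward pass
        pred = h @ W2 + b2
        err = pred - y
        # backpropagation for the mean squared error loss
        dW2 = h.T @ err / len(X)
        db2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)
        dW1 = X.T @ dh / len(X)
        db1 = dh.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.sqrt(np.mean((pred - y) ** 2)))

# Toy regression data
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 3))
y = np.sin(X).sum(1, keepdims=True) + rng.normal(0, 0.1, (200, 1))

print(train_tiny_net(X, y))            # unseeded: RMSE changes from run to run
print(train_tiny_net(X, y))            # unseeded: usually a different RMSE
print(train_tiny_net(X, y, seed=42))   # seeded: always the same RMSE
print(train_tiny_net(X, y, seed=42))   # seeded: identical to the line above
```

The same idea is why selecting the reproducible option and a local random seed in the operator's parameters pins the result down to a single value.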
Answers
-
Hi Varun,
Thank you again for the reply to my post. Best