Hello community members,
I have a question about normalization in the Neural Net operator.
According to the guide, normalization of the data can be carried out automatically with the normalize option: "This is an expert parameter. The Neural Net operator uses an usual sigmoid function as the activation function. Therefore, the value range of the attributes should be scaled to -1 and +1. This can be done through the normalize parameter. Normalization is performed before learning. Although it increases runtime but it is necessary in most cases."
My question is: why is it necessary to scale the values between -1 and +1? Could we instead scale them between 0 and 1, or use another normalization type?
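To make the question concrete, both ranges are just linear min-max rescalings of the same data, so I would expect either to work in principle. Here is a small sketch of what I mean (the `scale` function is my own illustration, not part of RapidMiner):

```python
def scale(values, lo, hi):
    """Linearly rescale values into the range [lo, hi] (min-max normalization)."""
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

data = [10.0, 20.0, 30.0, 40.0]

# Range suggested by the guide: smallest value maps to -1, largest to +1.
print(scale(data, -1.0, 1.0))

# Alternative range: smallest value maps to 0, largest to 1.
print(scale(data, 0.0, 1.0))
```

Both results preserve the relative spacing of the attribute values; only the target interval differs, which is why I am wondering whether the choice of -1/+1 is tied specifically to the sigmoid activation mentioned in the guide.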
Thank you very much for your answers.