Neural net and normalization

GonzaloAD New Altair Community Member
edited November 5 in Community Q&A

Hello community members,

I have a question about the normalization option of the Neural Net operator.

 

According to the guide, normalization of the data can be carried out automatically with the normalize option: "This is an expert parameter. The Neural Net operator uses a usual sigmoid function as the activation function. Therefore, the value range of the attributes should be scaled to -1 and +1. This can be done through the normalize parameter. Normalization is performed before learning. Although it increases runtime, it is necessary in most cases."

 

My question is: Why is it necessary to scale the values between -1 and +1? Can we choose to scale them between 0 and 1 or another normalization type?

 

Thank you very much for your answers.

Best Answer

  • rfuentealba
    rfuentealba New Altair Community Member
    Answer ✓

    Hi @GonzaloAD,

     

    An Artificial Neural Network (ANN) is a collection of units, arranged in a series of layers. There are three types of units: input units, hidden units, and output units. Each connection between two units carries a weight, which can be positive or negative, depending on whether one unit "excites" or "inhibits" the other.

     

    If your variables are on different scales, the ANN algorithm will probably fail to notice correlations between them. Normalizing your inputs prevents such scale differences from distorting the weights. If you want to set your upper and lower bounds to +32768 and -32767 you can, but you have to use the "Normalize" operator before feeding the data into your ANN.
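    In RapidMiner the Normalize operator does this for you; purely to illustrate the arithmetic of rescaling an attribute to an arbitrary target range, here is a minimal Python sketch (the function name `rescale` is made up for illustration, not a RapidMiner API):

    ```python
    def rescale(values, new_min=-1.0, new_max=1.0):
        """Min-max normalization: map values linearly to [new_min, new_max]."""
        lo, hi = min(values), max(values)
        span = hi - lo
        if span == 0:
            # A constant attribute carries no information; map it to new_min.
            return [new_min for _ in values]
        return [new_min + (v - lo) * (new_max - new_min) / span for v in values]

    ages = [18, 35, 52, 70]
    scaled_sym = rescale(ages)            # target range [-1, +1]
    scaled_unit = rescale(ages, 0.0, 1.0) # target range [0, 1]
    ```

    The point is that the target range is just a parameter of the linear map: scaling to [0, 1] or [-1, +1] is the same operation with different endpoints.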

    Hope this helps.

     

Answers

  • MartinLiebig
    Altair Employee

    Hi,

    I think it's not strictly necessary, but it can make your model better. I recommend using an explicit Normalize operator up front. That way you can also change your normalization scheme.
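    One common alternative scheme is the Z-transformation (standardization), which centers each attribute at mean 0 with standard deviation 1 instead of forcing a fixed range. A minimal Python sketch of the arithmetic (for illustration only, not RapidMiner code):

    ```python
    import statistics

    def z_transform(values):
        """Z-score normalization: shift to mean 0, scale to std dev 1."""
        mu = statistics.mean(values)
        sigma = statistics.pstdev(values)  # population standard deviation
        return [(v - mu) / sigma for v in values]

    incomes = [30_000, 45_000, 60_000, 75_000]
    standardized = z_transform(incomes)
    ```

    Unlike min-max scaling, the output is not bounded to a fixed interval, but it is less sensitive to a single extreme value stretching the range.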

     

    Best,

    Martin

  • GonzaloAD
    GonzaloAD New Altair Community Member

    Perfect,


    As I supposed, it's just the default option of the operator. As I had seen, another normalization can be used, as long as the Normalize operator is applied beforehand.

     

    Thank you very much for your answer