Neural Net Loop Parameters

btibert New Altair Community Member
edited November 2024 in Community Q&A
Is it not possible to use the Loop Parameters Operator to test various scenarios with respect to the hidden layers of the network?  We can control this by hand, so I would have assumed this would have been an option.  For context, I want to try to show my class the concept of various structures and how to assess performance across that set of tests, with the ultimate goal of discussing overfitting.


Best Answer

  • varunm1 New Altair Community Member
    Answer ✓
    Hello @btibert

    That is a good question. From my understanding, specifying the hidden layers involves two components: 1. the number of hidden layers, and 2. the size of each layer. The same limitation applies to the Loop Parameters operator as well as Optimize Parameters; even the deep learning operators don't expose these as loopable parameters.

    Maybe it's a computational limitation. Not sure, though.

Answers

  • btibert New Altair Community Member
    Sure. From a teaching point of view, it would have been nice to specify these so as to highlight how the structure of the network changes relative to performance. For the layers, I think we just have to specify an arbitrary name and the number of neurons, so the two inputs I would have liked to iterate over are the number of hidden layers (1-3) and the neurons per layer (2-7), across all combinations. Regardless, I just wanted to make sure I wasn't missing something.
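For readers who want to demonstrate this sweep outside the RapidMiner GUI, the combinations described above (1-3 hidden layers, 2-7 neurons per layer) are straightforward to loop over in code. This is a hedged sketch using scikit-learn's MLPClassifier on a synthetic dataset as a stand-in, not the RapidMiner Neural Net operator itself; comparing train vs. test accuracy across the grid is one way to surface the overfitting discussion.

```python
from itertools import product

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy dataset standing in for whatever data the class is using.
X, y = make_classification(n_samples=300, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

results = []
# Iterate over 1-3 hidden layers and 2-7 neurons per layer,
# mirroring the combinations described in the thread.
for n_layers, n_neurons in product(range(1, 4), range(2, 8)):
    layers = (n_neurons,) * n_layers
    clf = MLPClassifier(hidden_layer_sizes=layers, max_iter=2000,
                        random_state=42)
    clf.fit(X_train, y_train)
    results.append((layers,
                    clf.score(X_train, y_train),   # train accuracy
                    clf.score(X_test, y_test)))    # test accuracy

# A widening train/test gap as the network grows is the overfitting signal.
for layers, train_acc, test_acc in results:
    print(layers, round(train_acc, 3), round(test_acc, 3))
```

Each tuple passed to `hidden_layer_sizes` encodes both the number of layers (its length) and the neurons per layer (its values), which is exactly the two-component specification the accepted answer points out.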