Neural Net Loop Parameters
btibert
New Altair Community Member
Is it not possible to use the Loop Parameters Operator to test various scenarios with respect to the hidden layers of the network? We can control this by hand, so I would have assumed this would have been an option. For context, I want to try to show my class the concept of various structures and how to assess performance across that set of tests, with the ultimate goal of discussing overfitting.
Best Answer
-
Hello @btibert
That is a good question. From my understanding, the hidden-layer specification has two components: (1) the number of hidden layers and (2) the size of each hidden layer. Neither Loop Parameters nor Optimize Parameters can iterate over these, and even the deep learning operators don't expose them as loopable parameters.
It may be a computational limitation, though I'm not sure.
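One way to see why these two components resist a simple parameter loop: each candidate architecture is a tuple (one entry per layer), not a single number, so a scalar sweep can't enumerate them. A minimal sketch in Python, using the ranges discussed later in this thread (the variable names are my own, not RapidMiner parameters):

```python
from itertools import product

# Each architecture couples two values: how many hidden layers, and how
# many neurons per layer. Expanding the grid by hand makes the point --
# every candidate is a tuple, which a scalar loop parameter can't express.
layer_counts = range(1, 4)   # 1 to 3 hidden layers
layer_sizes = range(2, 8)    # 2 to 7 neurons per layer

candidates = [(size,) * layers
              for layers, size in product(layer_counts, layer_sizes)]
print(len(candidates))  # 3 layer counts x 6 sizes = 18 architectures
```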
Answers
-
Sure, from a teaching point of view it would have been nice to iterate over these to highlight how the structure of the network changes relative to performance. For the layers, I believe we just specify an arbitrary name and a number of neurons, so the two inputs I'd like to loop over are the number of hidden layers (1-3) and the neurons per layer (2-7), across all combinations. Regardless, I just wanted to make sure I wasn't missing something.
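For the classroom use case, the sweep described above can be scripted outside RapidMiner. A hedged sketch using scikit-learn's MLPClassifier as a stand-in for the Neural Net operator (the dataset and all names here are my own assumptions, not RapidMiner API): it trains every combination of 1-3 layers and 2-7 neurons and reports train vs. test accuracy, which is exactly the gap you'd point at when discussing overfitting.

```python
from itertools import product

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy two-class problem; any small dataset works for the demonstration.
X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

results = {}
for layers, neurons in product(range(1, 4), range(2, 8)):
    # hidden_layer_sizes encodes both components at once:
    # tuple length = number of layers, each entry = neurons in that layer.
    net = MLPClassifier(hidden_layer_sizes=(neurons,) * layers,
                        max_iter=2000, random_state=0)
    net.fit(X_train, y_train)
    results[(layers, neurons)] = (net.score(X_train, y_train),
                                  net.score(X_test, y_test))

# A train/test gap that widens as capacity grows is the overfitting signal.
for (layers, neurons), (train_acc, test_acc) in sorted(results.items()):
    print(f"{layers} layer(s) x {neurons} neurons: "
          f"train={train_acc:.2f} test={test_acc:.2f}")
```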