Setting regularization and the activation function type in the Neural Network operator
How do I set regularization (and its value), and how do I change the type of activation function, in the Neural Network operator in RapidMiner?
If those settings cannot be changed there, could Python code be used instead to achieve this kind of parameter setting?
Any reply will be appreciated.
These controls are not available in the simple Neural Net operator, but the Deep Learning operator can change the activation function, and the Deep Learning extension has even more controls. And you can always use any Python model you want with the Python Scripting extension.
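As a minimal sketch of the Python route: scikit-learn's MLPClassifier (assumed to be available in your Python environment) exposes both settings the simple Neural Net operator hides, namely the L2 regularization strength (`alpha`) and the activation function. The dataset below is synthetic and purely illustrative; inside RapidMiner you would instead receive your ExampleSet as a pandas DataFrame in the Execute Python operator.

```python
# Sketch: a neural net with explicit regularization and activation choice,
# using scikit-learn's MLPClassifier (one possible backend for the
# Python Scripting route suggested above).
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic data, for illustration only.
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

clf = MLPClassifier(
    hidden_layer_sizes=(16,),
    activation="relu",   # alternatives: "logistic", "tanh", "identity"
    alpha=0.001,         # L2 regularization strength
    solver="adam",
    max_iter=500,
    random_state=42,
)
clf.fit(X, y)
print(clf.score(X, y))
```

Both parameters are plain keyword arguments, so they can also be tuned in a grid search rather than fixed by hand.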
Hi @learner2020,
As @Telcontar120 said, "the Deep Learning extension has even more controls": you will need this extension (downloadable from the Marketplace) to set both of these:
- the optimizer (Adam, etc.)
- the activation function(s) of your layer(s) (ReLU, etc.)
The optimizer is set in the parameters of the Deep Learning operator:

The activation function is set in the parameters of each layer-type operator; for example, in the Fully Connected Layer operator:

Regards,
Lionel