GELU activation function missing in the Deep Learning extension. How to implement it?
Hey, I hope you are doing well.
I want to implement a specific deep learning architecture in RapidMiner. For that I need, for example, the GELU activation function in my fully connected layer, which is not available there. There are also other operations I would have to implement for this architecture, such as skip connections.
I tried to execute a custom Python script at the specific places, but that did not work for me, because a script operator cannot take a layer architecture as input. That's why RapidMiner throws an error there.
So I wanted to ask what my options are. I thought about implementing the model fully in Python, but I like RapidMiner and would prefer to keep using it.
I hope I articulated my question clearly. Feel free to ask if anything is unclear.
Best regards,
Enes