
"How to see the variable importance of a neural net, similar to deep learning?"

User: "annata"
Altair Community Member
Updated by Jocelyn

Hi everyone

Please help me. I am working in RapidMiner, using the Deep Learning and Neural Net operators to build a regression model. The Deep Learning operator has a weight output port that delivers the weight of each attribute with respect to the label attribute, but the Neural Net operator does not. How can I show the attribute weights for a Neural Net model?
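Outside of RapidMiner, a common model-agnostic workaround is permutation importance: shuffle one attribute at a time and measure how much the model's error grows. The sketch below is only an illustration of that general technique in plain Python/NumPy (a tiny hand-rolled neural net on synthetic data, not RapidMiner's Neural Net operator); all names and data here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y depends strongly on x0, weakly on x1, not at all on x2
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Tiny one-hidden-layer neural net trained by full-batch gradient descent
W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return (h @ W2 + b2).ravel(), h

lr = 0.01
for _ in range(3000):
    pred, h = forward(X)
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers
    gW2 = h.T @ err[:, None] / len(y)
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

base = mse(y, forward(X)[0])

# Permutation importance: shuffle one attribute, measure the increase in error
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(mse(y, forward(Xp)[0]) - base)
    print(f"attribute {j}: importance {importance[j]:.3f}")
```

Attribute 0 should come out with by far the largest importance, attribute 2 near zero, mirroring how the data was generated. The same idea applies to any trained model where a direct weight port is not available.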


Thanks,

An