"How to see the variable importance of a neural net similar to deep learning?"
Hi everyone,
Please help me. I am working in RapidMiner, using the Deep Learning and Neural Net operators to build a regression model. The Deep Learning operator has a weight output port that delivers the weight of each attribute with respect to the label attribute, but the Neural Net operator does not. How can I show the attribute weights for the Neural Net operator?
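For reference, outside RapidMiner this is roughly the kind of attribute importance I am after. Below is a minimal, model-agnostic sketch using permutation importance with scikit-learn's MLPRegressor; the dataset and attribute names are made up just to illustrate the idea, and this is not the RapidMiner operator itself:

```python
# Minimal sketch (not RapidMiner): permutation importance for a neural net
# regressor, using a synthetic dataset purely for illustration.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

# Hypothetical data standing in for my ExampleSet (5 attributes, 1 label)
X, y = make_regression(n_samples=500, n_features=5, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a neural net regression model
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Shuffle each attribute and measure the drop in score -> attribute importance
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"attribute_{i}: importance = {importance:.4f}")
```

Is there an equivalent way (an operator or port) to get this kind of attribute weight/importance out of the Neural Net operator?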
Thanks,
An