Information gain and numerical attributes
IngoRM
Original message from SourceForge forum at http://sourceforge.net/forum/forum.php?thread_id=2043728&forum_id=390413
Hi,
how does RapidMiner handle numerical attributes and information gain calculation for feature selection? Is every occurring value used, or does RM calculate several "bins"?
Answer by Ingo Mierswa:
Hello,
do you refer to the InfoGainWeighting operator or the information gain calculation inside of a decision tree learner?
> Is every occurring value used or does RM calculate several "bins"?
Both are possible. If you discretize the values first with one of the discretization operators, these bins are used. If not, RM tries all possible split points.
Cheers,
Ingo
Answer by topic starter:
Hi,
I was referring to the InfoGainWeighting operator, which is used for feature selection.
Answer by Ingo Mierswa:
Hi again,
Same applies here: if the attributes are already nominal, or if they have been discretized beforehand, the usual information gain is used. If not, this operator tries all possible split points between two neighboring values, selects the split point with the highest gain, and delivers the corresponding gain.
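To make the split-point search concrete, here is a minimal Python sketch of the idea described above. This is not RapidMiner's actual implementation; the function name and structure are assumptions for illustration. It sorts a numeric attribute, evaluates the information gain at every midpoint between two neighboring distinct values, and returns the best gain together with its split point:

```python
# Hedged sketch (not RapidMiner code): evaluate a numeric attribute by
# trying every split point between neighboring distinct sorted values
# and keeping the one with the highest information gain.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_split_gain(values, labels):
    """Return (best gain, best split point) over all midpoints
    between neighboring distinct attribute values."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)           # entropy before splitting
    n = len(pairs)
    best_gain, best_split = 0.0, None
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue                 # no split between identical values
        split = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [label for _, label in pairs[:i]]
        right = [label for _, label in pairs[i:]]
        gain = (base
                - (len(left) / n) * entropy(left)
                - (len(right) / n) * entropy(right))
        if gain > best_gain:
            best_gain, best_split = gain, split
    return best_gain, best_split

# Example: the attribute separates the two classes perfectly at 2.5,
# so the gain equals the full label entropy of 1.0.
gain, split = best_split_gain([1.0, 2.0, 3.0, 4.0], ["a", "a", "b", "b"])
```

In a weighting context the returned gain would serve as the attribute's weight; a decision tree learner would instead use the returned split point to branch.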
Cheers,
Ingo