Decision Trees and Attribute Set Selection

chaosbringer New Altair Community Member
edited November 5 in Community Q&A
Hi,
I have a probably stupid question.
Is it reasonable to use the Attribute Selection Operator (e.g. Evolutionary Optimization) with decision trees?
I ask because during decision tree induction the attribute set is automatically reduced by pruning and information-criterion-based node splitting, and this process automatically eliminates unnecessary attributes. Am I right?
If I am, would it not make sense to use a decision tree for attribute selection and then use the attributes appearing at the tree nodes for, e.g., neural net learning?

Thank you very much.

Answers

  • haddock New Altair Community Member
    Hi there,
    I ask because during decision tree induction the attribute set is automatically reduced by pruning and information-criterion-based node splitting, and this process automatically eliminates unnecessary attributes. Am I right?
    In a word, no! What happens is that the most entropy-reducing attribute gets used, wherever you are in the tree. The converse is that the useless stuff remains to be used later. In theory this means that you can make decisions without having to ask silly questions, and in practice it means that you can overcome the noise that junk attributes generate. The flip side is that you can end up carrying a load of junk through the process; it won't affect the result, but it slows things down.
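    To put the entropy-reduction point in code: a minimal Python sketch of information gain for nominal attributes (illustrative names only, not RapidMiner operators), showing that an informative attribute wins the split while a junk attribute scores zero and simply gets carried along:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """Entropy reduction achieved by splitting `labels` on a nominal attribute."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Toy data: attribute `a` separates the classes, attribute `b` is junk.
labels = ["yes", "yes", "no", "no"]
a = ["hot", "hot", "cold", "cold"]   # informative
b = ["x", "y", "x", "y"]             # noise

print(information_gain(a, labels))   # 1.0 -> chosen for the split
print(information_gain(b, labels))   # 0.0 -> left aside
```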

    This latter consideration might provide a motivation for reducing the attribute set by pre-processing, but you have to be careful not to introduce junk in return, like user assumptions about what matters, even indirectly through parameter bias.
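    As for the workflow suggested in the original question (train a tree, keep only the attributes it actually splits on, then train a neural net on those), it could be sketched like this in Python with scikit-learn rather than RapidMiner; this is only an assumed sketch, where `feature_importances_ > 0` marks attributes that appear in at least one split:

```python
# Sketch: tree-based attribute selection feeding a neural net (scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

# Attributes with nonzero importance are the ones used in at least one split.
used = tree.feature_importances_ > 0
print(f"{used.sum()} of {X.shape[1]} attributes used by the tree")

# Train the neural net on the reduced attribute set only.
net = MLPClassifier(max_iter=2000, random_state=0).fit(X_tr[:, used], y_tr)
print(f"test accuracy on reduced set: {net.score(X_te[:, used], y_te):.2f}")
```

    Note the caveat from the answer above still applies: the tree's choices encode its own biases (depth limit, split criterion), so the reduced set is not guaranteed to be the best subset for a different learner.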

    O for a lump of perfect Green...

  • chaosbringer New Altair Community Member
    Thank you for your answer.