Sequential floating search

Danyo83 · New Altair Community Member · edited November 5 in Community Q&A
Hey,

purely greedy approaches to feature selection have the disadvantage that once a feature is chosen or eliminated, that decision can never be revoked, even if reversing it would improve the learner. Floating methods avoid this: for example, one could first pick a good starting feature set via random search (or a GA-based search) over, say, 100 iterations, and then alternate sequential forward selection and backward elimination repeatedly until there is no further improvement.
This would be a great tool for RM.
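
For concreteness, here is a minimal sketch of one such floating method, sequential floating forward selection (SFFS, usually attributed to Pudil et al.), in Python. The scikit-learn dataset, the 5-fold accuracy scorer, and the target size k=5 are illustrative assumptions, not anything RM ships today:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def score(model, X, y, features):
    """Cross-validated accuracy using only the given feature indices."""
    return cross_val_score(model, X[:, features], y, cv=5).mean()

def sffs(model, X, y, k):
    """Sequential floating forward selection up to k features (a sketch)."""
    best_at = {}   # best score seen for each subset size (the SFFS guard)
    selected = []
    while len(selected) < k:
        # Forward step: add the single feature that helps most.
        remaining = [f for f in range(X.shape[1]) if f not in selected]
        gains = [score(model, X, y, selected + [f]) for f in remaining]
        i = int(np.argmax(gains))
        selected.append(remaining[i])
        size = len(selected)
        best_at[size] = max(best_at.get(size, -np.inf), gains[i])
        # Floating backward steps: drop a feature, but only if the smaller
        # set beats the best set of that size seen so far (prevents cycling).
        while len(selected) > 2:
            trials = [[g for g in selected if g != f] for f in selected]
            scores = [score(model, X, y, t) for t in trials]
            j = int(np.argmax(scores))
            if scores[j] > best_at.get(len(selected) - 1, -np.inf):
                selected = trials[j]
                best_at[len(selected)] = scores[j]
            else:
                break
    return selected, best_at[len(selected)]

X, y = make_classification(n_samples=300, n_features=15,
                           n_informative=5, random_state=0)
features, acc = sffs(KNeighborsClassifier(), X, y, k=5)
print(f"selected: {sorted(features)}, CV accuracy: {acc:.3f}")
```

The best_at bookkeeping is the part that keeps the add/remove alternation from oscillating forever, which is the main practical wrinkle in floating search.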

Daniel

Answers

  • wessel · New Altair Community Member
    There is already GA-based feature selection: the "Optimize Selection (Evolutionary)" operator.

    There is also an entire Feature Selection extension, and the P-Rules extension with different selection operators.