Sequential floating search
Danyo83
New Altair Community Member
Hey,
Pure greedy approaches to feature selection have the disadvantage that once a feature is chosen or eliminated, that decision cannot be revoked, even if revoking it would make the learner better. Floating methods address this. For example, one could first pick the best feature subset found by random search (or GA-based search) after 100 iterations, and then alternate sequential forward selection and backward elimination repeatedly until there is no further improvement (rough sketch below).
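A minimal Python sketch of the floating idea (sequential floating forward selection, SFFS): here `evaluate` is a hypothetical stand-in for a cross-validated learner score, and the feature names in the usage example are made up for illustration.

```python
def sffs(features, evaluate, max_size=None):
    """Sequential floating forward selection (sketch).

    features : list of candidate features
    evaluate : callable(feature_set) -> score, higher is better
               (assumed to wrap cross-validated learner performance)
    """
    max_size = max_size or len(features)
    selected, best_score = set(), float("-inf")
    while len(selected) < max_size:
        # Forward step: add the single feature that helps most.
        candidates = [f for f in features if f not in selected]
        if not candidates:
            break
        f_best = max(candidates, key=lambda f: evaluate(selected | {f}))
        score = evaluate(selected | {f_best})
        if score <= best_score:
            break  # no further improvement -> stop
        selected.add(f_best)
        best_score = score
        # Floating backward steps: undo earlier choices while that
        # actually improves the score (this is what plain greedy lacks).
        improved = True
        while improved and len(selected) > 1:
            improved = False
            f_worst = max(selected, key=lambda f: evaluate(selected - {f}))
            drop_score = evaluate(selected - {f_worst})
            if drop_score > best_score:
                selected.remove(f_worst)
                best_score = drop_score
                improved = True
    return selected, best_score


# Toy usage: score rewards overlap with a hidden "true" subset
# and penalizes extra features (purely illustrative).
true_set = {"x1", "x3"}
evaluate = lambda s: len(s & true_set) - 0.5 * len(s - true_set)
print(sffs(["x1", "x2", "x3", "x4"], evaluate))  # -> ({'x1', 'x3'}, 2.0)
```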
Would be a great tool for RM.
Daniel
Answers
There is already GA-based feature selection: the "Optimize Selection (Evolutionary)" operator. There is also an entire Feature Selection extension, and the P-Rules extension with its different selection operators.