Implement Sequential Forward Floating Search
Salo
New Altair Community Member
I am learning how to use RapidMiner 5.3 and wanted to try a sequential floating selection algorithm in it. However, it's not available in RapidMiner, so I was trying to think of a workaround to implement it.
The steps of it are this:
Step 1: Inclusion
Use the basic SFS method to select the most significant feature with respect to X and include it in X.
Stop if d features have been selected; otherwise go to step 2.
Step 2: Conditional exclusion.
Find the least significant feature k in X. If it is the feature just added, keep it and return to step 1. Otherwise, exclude feature k. Note that X is now better than it was before step 1. Continue to step 3.
Step 3: Continuation of conditional exclusion
Again find the least significant feature in X. If its removal will (a) leave X with at least 2 features, and (b) make the value of J(X) greater than the criterion value of the best feature subset of that size found so far, then remove it and repeat step 3. When these two conditions cease to be satisfied, return to step 1.
I could implement Forward Selection and then use the original data set for the conditional exclusion (Backward Selection) step of the algorithm, and finally join the most significant feature from the forward search with the dataset obtained from the conditional exclusion. I think that would cover the first two steps.
But then I have no idea how to tackle step 3. Any ideas?
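For reference, the three steps above can be sketched outside RapidMiner in plain Python. This is a minimal illustration only: the criterion J below is a toy stand-in (hypothetical per-feature scores minus a made-up redundancy penalty), whereas in a real process J would be something like a classifier's cross-validated accuracy. The feature names and scores are invented for the example.

```python
from itertools import combinations

# Hypothetical relevance scores and one redundant pair -- assumptions
# made up purely to exercise the algorithm, not real data.
SCORES = {"a": 5.0, "b": 4.0, "c": 3.5, "d": 1.0}
REDUNDANT = {frozenset({"a", "b"}): 3.0}

def J(subset):
    """Toy criterion: individual relevance minus pairwise redundancy."""
    total = sum(SCORES[f] for f in subset)
    for pair in combinations(subset, 2):
        total -= REDUNDANT.get(frozenset(pair), 0.0)
    return total

def sffs(features, d):
    X = []        # current feature subset
    best = {}     # best criterion value seen for each subset size
    while len(X) < d:
        # Step 1: inclusion -- add the most significant feature,
        # i.e. the one that maximizes J of the enlarged subset.
        cand = max((f for f in features if f not in X),
                   key=lambda f: J(X + [f]))
        X.append(cand)
        best[len(X)] = max(best.get(len(X), float("-inf")), J(X))
        # Steps 2-3: conditional exclusion. The least significant
        # feature is the one whose removal leaves the highest J.
        while len(X) > 2:   # condition (a): keep at least 2 features
            worst = max(X, key=lambda f: J([g for g in X if g != f]))
            if worst == cand:   # step 2: never remove the feature just added
                break
            reduced = [g for g in X if g != worst]
            # Condition (b): only exclude if this beats the best
            # subset of the reduced size found so far.
            if J(reduced) > best.get(len(reduced), float("-inf")):
                X = reduced
                best[len(X)] = J(X)
                cand = None     # step 2's check applies only once
            else:
                break
    return X

print(sffs(list(SCORES), d=3))
```

The floating behaviour shows up when a newly added feature makes an earlier pick redundant: the exclusion loop then backtracks to a smaller, better subset before growing again.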