Can someone please help with more details on how the "greedy" feature selection in the Linear Regression operator works?
In the Optimize Selection operator, the two greedy algorithms (forward selection and backward elimination) are clearly specified. However, for the Linear Regression operator it is not clear which algorithm is used by the built-in "greedy" option in the feature selection dropdown. I could not find any elaboration on this in the documentation for the Linear Regression operator, so I also checked the source code for this option. According to the comments there:
This class implements an internal forward selection for the linear regression. It uses the Akaike Criterion that is maximized roundwise. Each round the attribute minimizing the akaike criterion is deselected.
I am trying to understand the exact model selection process happening here, but the description and the code are a bit unclear to me.
1. The statement above says internal forward selection, but the last sentence ("...the attribute minimizing the akaike criterion is deselected") suggests backward elimination. Which one is it? Can someone elaborate on this? (See the first sketch below for what I would expect forward selection to look like.)
2. How is the AIC criterion computed here? A couple of sources (link1 and link2) give the AIC for linear regression as n * ln(SSE / n) + 2 * (k + 1), where n is the number of observations and k is the number of predictors (one is added for the intercept term). The source code states:
akaike = (numberOfExamples - numberOfUsedAttributes) + 2 * numberOfUsedAttributes;
This is a bit confusing as well (see the second sketch below). Any insights would be appreciated.
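First sketch (question 1): this is my own rough Java sketch of what I would expect an "internal forward selection that minimizes AIC" to look like; it is not the operator's code, and the attribute names and the dummy AIC function are made up just so it runs. The "deselected" wording in the quoted description is what does not fit this picture for me.

import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.ToDoubleFunction;

// My own sketch of forward selection driven by AIC, NOT the operator's code.
// aicOf stands for "fit a linear regression on this attribute subset and
// return its AIC"; only a dummy function is plugged in so the loop runs.
public class ForwardSelectionSketch {

    static Set<String> forwardSelect(List<String> allAttributes,
                                     ToDoubleFunction<Set<String>> aicOf) {
        Set<String> selected = new HashSet<>();
        double bestAic = aicOf.applyAsDouble(selected); // intercept-only model

        boolean improved = true;
        while (improved) {
            improved = false;
            String bestCandidate = null;

            // Each round: try adding every attribute that is not selected yet
            for (String candidate : allAttributes) {
                if (selected.contains(candidate)) {
                    continue;
                }
                Set<String> trial = new HashSet<>(selected);
                trial.add(candidate);
                double aic = aicOf.applyAsDouble(trial);
                if (aic < bestAic) { // remember the candidate that lowers AIC the most
                    bestAic = aic;
                    bestCandidate = candidate;
                }
            }

            if (bestCandidate != null) {
                selected.add(bestCandidate); // attribute is selected, not "deselected"
                improved = true;
            }
        }
        return selected;
    }

    public static void main(String[] args) {
        List<String> attributes = List.of("att1", "att2", "att3");
        // Dummy AIC: pretend att2 is the only attribute that improves the model
        ToDoubleFunction<Set<String>> dummyAic =
                subset -> subset.contains("att2") ? 10.0 : 20.0;
        System.out.println("selected: " + forwardSelect(attributes, dummyAic));
    }
}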
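Second sketch (question 2): a side-by-side evaluation of the textbook formula and the expression quoted from the source; the class and parameter names and the numbers (n, SSE, k) are mine and purely illustrative.

// My own illustration, not the operator's code: it just evaluates the textbook
// AIC formula and the expression quoted from the source for made-up numbers.
public class AkaikeComparison {

    // Textbook AIC for linear regression: n * ln(SSE / n) + 2 * (k + 1)
    static double textbookAic(int n, double sse, int k) {
        return n * Math.log(sse / n) + 2 * (k + 1);
    }

    // The expression as quoted from the source (parameter names are mine):
    // (numberOfExamples - numberOfUsedAttributes) + 2 * numberOfUsedAttributes
    static double quotedAkaike(int numberOfExamples, int numberOfUsedAttributes) {
        return (numberOfExamples - numberOfUsedAttributes) + 2 * numberOfUsedAttributes;
    }

    public static void main(String[] args) {
        int n = 100;        // number of observations (made up)
        double sse = 250.0; // sum of squared errors (made up)
        int k = 3;          // number of predictors in the current subset

        System.out.println("textbook AIC     : " + textbookAic(n, sse, k));
        System.out.println("quoted expression: " + quotedAkaike(n, k));
    }
}

Unless I am misreading the quoted line, it simplifies to numberOfExamples + numberOfUsedAttributes and does not use the residuals at all, which is part of why I suspect I am missing something (perhaps an error term computed elsewhere in the code).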