
forward selection and backward elimination

User: "AD2019"
New Altair Community Member
Updated by Jocelyn
I ran a multiple regression model on a dataset with 15 variables, first using the "forward selection" nested operator and then using the "backward elimination" nested operator. I got dramatically different models: the first had 3 independent variables, the second had 8 IVs. Why such a big difference? I realize the serial addition or elimination of IVs may yield local optima, but is it common to get such wildly different "optimal" models for the same dataset? How can training yield such dramatically different trained models?
thanks in advance,
AD
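A minimal sketch of why this happens, assuming a greedy stepwise search scored by AIC (the operator's internal criterion may differ): forward selection starts from the empty model and adds one variable at a time, while backward elimination starts from the full model and removes one at a time. Each path is greedy, so with correlated predictors the two searches can lock into different local optima and return different subsets. All variable names and the data-generating setup below are hypothetical, for illustration only.

```python
import numpy as np

def fit_sse(X, y, cols):
    """Sum of squared errors of an OLS fit (with intercept) on the given columns."""
    if not cols:
        return float(np.sum((y - y.mean()) ** 2))
    A = np.column_stack([np.ones(len(y)), X[:, list(cols)]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return float(resid @ resid)

def aic(X, y, cols):
    """Gaussian AIC: n*log(SSE/n) + 2k; lower is better."""
    n = len(y)
    k = len(cols) + 1  # +1 for the intercept
    return n * np.log(fit_sse(X, y, cols) / n) + 2 * k

def forward_selection(X, y):
    """Greedily add the variable that most improves AIC, until none helps."""
    selected, remaining = [], list(range(X.shape[1]))
    best = aic(X, y, selected)
    improved = True
    while improved and remaining:
        improved = False
        score, j = min((aic(X, y, selected + [j]), j) for j in remaining)
        if score < best:
            best, improved = score, True
            selected.append(j)
            remaining.remove(j)
    return sorted(selected)

def backward_elimination(X, y):
    """Greedily drop the variable whose removal most improves AIC."""
    selected = list(range(X.shape[1]))
    best = aic(X, y, selected)
    improved = True
    while improved and len(selected) > 1:
        improved = False
        score, j = min(
            (aic(X, y, [c for c in selected if c != j]), j) for j in selected
        )
        if score < best:
            best, improved = score, True
            selected.remove(j)
    return sorted(selected)

# Synthetic data: 15 predictors, a few deliberately collinear,
# so the two greedy search paths can diverge.
rng = np.random.default_rng(0)
n, p = 100, 15
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)           # near-duplicate of column 0
X[:, 2] = X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=n)  # linear mix of 0 and 1
y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=n)

print("forward selection picked: ", forward_selection(X, y))
print("backward elimination kept:", backward_elimination(X, y))
```

Both subsets will be locally optimal for their own search direction, yet they need not match: forward selection never reconsiders a variable once a correlated proxy is in the model, and backward elimination never re-adds one it has dropped. Large disagreements between the two are therefore common when predictors are correlated, and are a sign the data support several near-equally-good models rather than one uniquely "best" one.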
