Hi everyone!
My question is the following:
I'm trying to build several classification models with different algorithms and techniques and compare the results.
I have already built a model using Random Forest with the bagging technique. Since my dataset has many attributes and most of them are almost useless with respect to the target variable, I performed a very simple feature selection based on attribute weights (a rough sketch of what I did is below). I have also read in the literature that with bagging it is better to perform feature selection on each bootstrap sample.
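Here is roughly what I did, sketched in scikit-learn (an assumption on my part; my actual tool and data differ, and the synthetic dataset below is only for illustration): rank the attributes by importance weight with a preliminary forest, keep the higher-weighted ones, and train the bagged Random Forest on the reduced set.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split

# Synthetic stand-in for my data: many attributes, few informative ones.
X, y = make_classification(n_samples=1000, n_features=50,
                           n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Very simple FS by attribute weights: keep the features whose importance
# in a preliminary forest is above the median weight.
selector = SelectFromModel(
    RandomForestClassifier(n_estimators=200, random_state=0),
    threshold="median",
).fit(X_train, y_train)

# Train the (bagged) Random Forest on the reduced attribute set only.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(selector.transform(X_train), y_train)
print("accuracy:", rf.score(selector.transform(X_test), y_test))
```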
But with an algorithm such as Gradient Boosted Trees, where the boosting procedure already picks, at each split, the features that most reduce the misclassification error, does it make sense to perform feature selection (FS) before training the model?
I have read that some boosting implementations already perform implicit feature selection, while others do not. One way I thought of checking this empirically is sketched below.
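In case it clarifies what I am asking, here is how I would compare the two options empirically, again a scikit-learn sketch with a synthetic dataset and illustrative parameters rather than my real setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=1000, n_features=50,
                           n_informative=8, random_state=0)

# Option 1: GBT on all attributes; boosting picks split features on its own.
gbt_plain = GradientBoostingClassifier(random_state=0)

# Option 2: the same GBT, but fed only the attributes that pass
# the weight threshold of a prior FS step.
gbt_with_fs = make_pipeline(
    SelectFromModel(
        RandomForestClassifier(n_estimators=200, random_state=0),
        threshold="median",
    ),
    GradientBoostingClassifier(random_state=0),
)

for name, model in [("GBT alone", gbt_plain), ("FS + GBT", gbt_with_fs)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Wrapping the FS step in the pipeline means it is re-fit inside each cross-validation fold, so the comparison is not biased by selecting features on the full dataset.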
I hope someone with more knowledge and experience can help me. Thank you in advance!