Hi Balázs,
thanks for responding.
In this case I am trying to improve on the accuracy of previous research. In an earlier study my friend used the Naive Bayes algorithm with feature weighting (PSO), and the overall accuracy was already quite high, so I am trying feature generation to see whether the results get better or not.
As for the outer validation you suggested, I still don't understand how to implement it, but I'm trying to find a solution. Sorry for my bad English.
Hi,
what is your goal with this process? Feature weighting/optimization (the PSO operator) is already being done by Random Forest. Feature generation can help if you suspect that your data contain "hidden" combinations of attributes that you don't know of.
The best way to validate the process that builds your models is to put everything into an outer validation, e.g. a Cross Validation.
Inside the training part of this outer validation you can try further optimizations such as feature generation, feature selection (though with tree-based methods, which already perform implicit feature selection, this won't help much) or parameter optimization. Inside each optimization operator you will need another (inner) validation process.
Regards,
Balázs
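
For readers wondering what such a nested setup looks like in practice: the thread itself is about RapidMiner operators, but a rough sketch of the same idea in Python with scikit-learn (an illustration only, with a simple grid search standing in for the PSO weighting) would be an outer cross-validation wrapped around a pipeline that does feature generation, parameter optimization with an inner validation, and Random Forest training:

    # Rough sketch of nested validation, assuming Python + scikit-learn
    # (not part of the original RapidMiner process).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Placeholder data; replace with your own example set.
    X, y = make_classification(n_samples=300, n_features=8, random_state=0)

    # "Training side" of the outer validation:
    # feature generation followed by the learner.
    pipeline = Pipeline([
        ("generate", PolynomialFeatures(degree=2, interaction_only=True,
                                        include_bias=False)),
        ("forest", RandomForestClassifier(random_state=0)),
    ])

    # Parameter optimization with its own (inner) cross-validation.
    param_grid = {"forest__n_estimators": [50, 100],
                  "forest__max_depth": [None, 5]}
    inner = GridSearchCV(pipeline, param_grid, cv=3)

    # Outer cross-validation: estimates the performance of the
    # whole model-building process, not just one fitted model.
    outer_scores = cross_val_score(inner, X, y, cv=5)
    print(outer_scores.mean(), outer_scores.std())

The key point is the same as in the RapidMiner process: every optimization step runs entirely inside the training folds of the outer validation, so the outer accuracy estimate is not biased by the optimization itself.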