Explain Predictions inside Cross Validation: Error

varunm1
New Altair Community Member
Hello,
I am working on a process that needs the Explain Predictions operator inside the testing subprocess of a Cross Validation operator. I use forward feature selection inside the training subprocess to select the features that support predictions, but when I connect the selected data to the "tra" port of Explain Predictions and the "tes" port of Cross Validation to the "tes" port of Explain Predictions, the process throws an error.
I want to take the attributes chosen by forward selection and apply the same attribute set to the testing port of Explain Predictions as well, so that the error goes away. Currently, all features go into the testing port of Explain Predictions, while the training port only receives the selected attributes, because training is done on the reduced set.
The process works fine without Explain Predictions. I attached the data as .ioo files, which can be placed directly in the repository, and the process is attached in this thread.
I built the same process with the Automatic Feature Engineering operator and did not face this issue. My understanding is that Automatic Feature Engineering outputs a feature set rather than a feature-selected example set the way forward selection does, so there I just used the Apply Feature Set operator to give the train and test data going into Explain Predictions the same attributes.
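To make the mismatch concrete, here is a minimal pandas sketch of the behavior I am after; the attribute names and values are hypothetical and not taken from the attached .ioo files:

```python
import pandas as pd

# Hypothetical stand-ins for the example sets inside cross-validation;
# attribute names a1..a3 and the values are made up for illustration.
train = pd.DataFrame({"a1": [1, 2], "a2": [3, 4], "a3": [5, 6], "label": [0, 1]})
test = pd.DataFrame({"a1": [7], "a2": [8], "a3": [9], "label": [1]})

# Suppose forward selection kept only these regular attributes.
selected = ["a1", "a3"]

# Training runs on the reduced attribute set ...
train_reduced = train[selected + ["label"]]

# ... so the test data must be reduced to the same attributes before it
# reaches the "tes" port of Explain Predictions; otherwise the two
# attribute sets differ and the operator complains.
test_reduced = test[selected + ["label"]]

assert list(train_reduced.columns) == list(test_reduced.columns)
```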

@IngoRM
Thanks for your suggestions.
Best Answer
-
Hi,
Another idea is to use the "weights" produced by the feature selection and deliver them to the through port. In the testing part you could then use the operator "Select by Weights" to replicate the same attribute set.
Hope this helps,
Ingo
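As a rough analogue of that idea outside RapidMiner, here is a sketch assuming a simple weight vector where selected attributes get weight 1.0 and dropped ones 0.0; the attribute names are hypothetical:

```python
import pandas as pd

# Hypothetical attribute weights as a feature selection step might
# produce them: selected attributes weighted 1.0, dropped ones 0.0.
weights = {"a1": 1.0, "a2": 0.0, "a3": 1.0}

def select_by_weights(example_set: pd.DataFrame, weights: dict,
                      label: str = "label") -> pd.DataFrame:
    """Keep only attributes with non-zero weight, plus the label column."""
    kept = [name for name, w in weights.items() if w > 0.0]
    return example_set[kept + [label]]

test = pd.DataFrame({"a1": [7], "a2": [8], "a3": [9], "label": [1]})
print(select_by_weights(test, weights))  # keeps a1, a3 and the label
```

Applying the same weights to both the training and the testing branch guarantees that both sides see an identical attribute set.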
Answers
-
Did you try a Remember/Recall combination on the attributes that need to be passed through from the train to the test set? That could work.
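The idea behind Remember/Recall, sketched in plain Python (the store and the key name are hypothetical; in RapidMiner the operators handle this through the process context):

```python
# A minimal stand-in for the Remember/Recall pattern: stash an object
# under a key on the training side and fetch it again on the testing side.
_store: dict = {}

def remember(key: str, obj) -> None:
    _store[key] = obj

def recall(key: str):
    return _store[key]

# Training subprocess: remember which attributes survived selection.
remember("selected_attributes", ["a1", "a3"])

# Testing subprocess: recall the list and filter the test data with it.
selected = recall("selected_attributes")
print(selected)  # ['a1', 'a3']
```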
-
Hello Brian,
I am passing the training data, exactly as selected by the feature selection, to Explain Predictions through the "thru" port of Cross Validation. My only issue is with the testing data. The testing data in cross-validation comes from the whole dataset, so it contains all attributes, and based on my understanding this is what makes Explain Predictions throw an error. I am just looking for a way to filter the attributes in the test data based on the attributes in the train data.
I will see what I can do with Remember/Recall.
-
Thanks @IngoRM
I have one question. When I pass the dataset with all attributes to Apply Model, it doesn't throw an error, but Explain Predictions throws an error as described earlier. Does Apply Model automatically filter out attributes that were not used in model building, while Explain Predictions is unable to do that?
Thanks for your suggestions.
-
The pre-flight checks for Explain Predictions are indeed a little stricter than those for Apply Model. In the next version we have already made the type checks a bit less strict, but I will also look into the restrictions for supersets...
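To illustrate the difference in strictness, a sketch of the observed behavior only (not of RapidMiner internals; the names are hypothetical):

```python
import pandas as pd

# Attributes the model was trained on (hypothetical).
model_features = ["a1", "a3"]

# Test data carrying an extra attribute a2.
test = pd.DataFrame({"a1": [7], "a2": [8], "a3": [9]})

def apply_model_lenient(df: pd.DataFrame) -> pd.DataFrame:
    """Apply-Model-style behavior: tolerate a superset of attributes
    and silently use only what the model needs."""
    return df[model_features]

def preflight_strict(df: pd.DataFrame) -> None:
    """Explain-Predictions-style pre-flight check: require an exact match."""
    if set(df.columns) != set(model_features):
        raise ValueError("attribute sets of model and test data differ")

apply_model_lenient(test)   # works despite the extra attribute
# preflight_strict(test)    # would raise, because of the extra a2
```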