How to Change Prediction Graphs in Model Simulator
User36964
New Altair Community Member
Hi to all,
I use a decision tree model with two datasets (one preprocessed, the other raw) to compare performance. In both datasets, cases are labeled 1 and controls 0.
When I run the model with the raw dataset, the Model Simulator's output gives information about Prediction: 0, for example: "The outcome is most likely 0, but the model is not very confident. In fact, the confidence for this decision is only 80.00% ..."
When I run the model with the preprocessed dataset, the Model Simulator's output gives information about Prediction: 1, for example: "The outcome is most likely 1, but the model is not very confident. In fact, the confidence for this decision is only 54.95%."
Is there a way to fix the Model Simulator's output so that it always reports in terms of Prediction: 1?
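To make the question concrete, here is a minimal sketch of the underlying idea, using a scikit-learn decision tree as a stand-in for the RapidMiner model (the dataset and column names are made up for illustration): rather than reporting the confidence of whichever class the model picks, you read off the confidence column for class 1 directly, no matter what the prediction is.

```python
# Minimal sketch (scikit-learn as a stand-in for the RapidMiner decision tree):
# regardless of which class the model predicts, report the confidence for
# class 1 by reading its column from predict_proba.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                          # toy features
y = (X[:, 0] + rng.normal(size=200) > 0).astype(int)   # toy 0/1 labels

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

sample = X[:1]                            # one example row
proba = tree.predict_proba(sample)[0]     # probabilities, ordered as tree.classes_
idx_1 = list(tree.classes_).index(1)      # column that belongs to class 1

print(f"Predicted class: {tree.predict(sample)[0]}")
print(f"Confidence for class 1: {proba[idx_1]:.2%}")
```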
Answers
Hi,
have you set the class of highest interest in the second step?
BR,
Martin
Yes, I chose the class of interest as Prediction: 1 in both cases.
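One thing worth noting when reading these outputs: for a two-class model the confidences are complementary, so the simulator's statement can always be re-expressed in terms of class 1. A minimal arithmetic check, using the figures quoted above:

```python
# For a binary classifier the two class confidences sum to 1, so the
# raw-dataset result "Prediction: 0 with 80.00% confidence" is the same
# statement as "class 1 with 20.00% confidence".
conf_0 = 0.80
conf_1 = 1.0 - conf_0
print(f"Confidence for class 1: {conf_1:.2%}")  # 20.00%
```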