How to Change Prediction Graphs in Model Simulator

User: "User36964"
Hi all,

I am using a decision tree model with two datasets (one preprocessed, the other raw) to compare performance. In both datasets, cases are labeled 1 and controls 0.

When I run the model with the raw dataset, the Model Simulator's output describes Prediction: 0, for example: "The outcome is most likely 0, but the model is not very confident. In fact, the confidence for this decision is only 80.00% ..."

When I run the model with the preprocessed dataset, the Model Simulator's output describes Prediction: 1, for example: "The outcome is most likely 1, but the model is not very confident. In fact, the confidence for this decision is only 54.95%."

Is there a way to fix the Model Simulator's output so that it always reports the confidence for Prediction: 1?
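To make it concrete: the Simulator's explanation reports the confidence for whichever class it predicts (0 in the first run, 1 in the second), while I would like the confidence for class 1 in both runs. Here is a minimal sketch of that distinction, assuming a scikit-learn decision tree as a stand-in for the RapidMiner model; the data and names below are purely illustrative, not the Model Simulator's API:

```python
# Illustrative sketch only (assumption: a scikit-learn decision tree stands in
# for the RapidMiner model; the data below is synthetic, not my datasets).
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

case = X[:1]                        # a single example case
proba = tree.predict_proba(case)[0]

# What the Simulator text reports: the predicted class and its confidence.
predicted = tree.classes_[proba.argmax()]
print(f"Prediction: {predicted}, confidence: {proba.max():.2%}")

# What I would like to see instead: the confidence for class 1,
# regardless of which class the model predicts.
idx_1 = list(tree.classes_).index(1)
print(f"Confidence for class 1: {proba[idx_1]:.2%}")
```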

    Hi,
    Have you set the class of highest interest in the second step?

    BR,
    Martin
    User: "User36964"
    OP
    Yes, I chose the class of interest as Prediction: 1 in both cases.