AutoModel Performance not Matching Confusion Matrix

User: "DennisBaloglu"
New Altair Community Member
Updated by Jocelyn
Hello,
For the AutoModel, I look at the output for the confusion matrix and performance measures. However, when calculating the performance measures by hand using the confusion matrix, it doesn't match up to the performance measures listed. Am I overlooking something?

    User: "lionelderkrikor"
    New Altair Community Member
    Accepted Answer
    Hi @DennisBaloglu

    Yes, this slight difference is expected:

    The displayed performance is computed (by default) with a "multi hold-out set validation" method on the 40% of the dataset that is not used to train the model.
    This test set is divided into 7 parts, and 7 performances are calculated.
    AutoModel then removes the maximum and the minimum performance (the outliers), and the average performance is calculated on the 5 remaining performances.
    Thus this calculated performance can differ slightly from the performance you calculate by hand from the confusion matrix.
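    The trimmed averaging described above can be sketched in a few lines of Python. The accuracy values below are hypothetical, just to show how the displayed average can differ from the single accuracy you would read off the combined confusion matrix:

    ```python
    # Hypothetical accuracies from the 7 parts of the hold-out test set
    # (made-up numbers, for illustration only).
    accuracies = [0.81, 0.84, 0.79, 0.90, 0.83, 0.76, 0.82]

    # Drop the minimum and maximum performance (the outliers) ...
    trimmed = sorted(accuracies)[1:-1]

    # ... and average the 5 remaining performances.
    avg = sum(trimmed) / len(trimmed)
    print(round(avg, 4))

    # Compare with the plain mean over all 7 parts, which is closer to
    # what a hand calculation from the full confusion matrix would give.
    print(round(sum(accuracies) / len(accuracies), 4))
    ```

    With these numbers the trimmed average (0.818) and the plain mean (0.8214) already disagree, which is the kind of small discrepancy you are seeing.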

    To review the performance calculation methodology, on the results panel (the final screen) you can click on the "information mark" and go to "Models" -> "Performance" to see a description of this methodology.

    Hope this helps,

    Regards,

    Lionel