Confidence interval calculation on performance

User: "Vorty"
New Altair Community Member
Updated by Jocelyn
Hello,

I see that for many classification performance metrics, RapidMiner provides an estimate which I interpret as a confidence interval around the reported value. However, I have failed to find the exact calculation that is performed, in particular for the kappa value.

To be clear, my question seems similar to the one from @taghaddo last December (see post here: https://community.rapidminer.com/discussion/54694/error-range-of-classifier), which, as far as I can tell, has not been answered. Would anyone be able to clarify this point? Maybe by posting a snippet of the actual source code for that calculation, as is sometimes done (e.g. for the kappa calculation here: https://community.rapidminer.com/discussion/54909/regarding-kappa-value-in-cross-validation)?

Many thanks,
François


    User: "lionelderkrikor"
    New Altair Community Member
    Accepted Answer
    Hi François,

    It is because RapidMiner is using, in this particular case, a Cross Validation (see the Help section of this operator).
    With this technique, RapidMiner trains and tests k models (according to the number of folds you set)
    and thus obtains k performance values. RapidMiner then calculates the average and the standard deviation of these k performances.
    So the displayed value is: mean +/- standard deviation
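    The calculation described above can be sketched as follows. This is a minimal illustration, not RapidMiner's actual source code; the per-fold kappa values are made up for the example, and whether RapidMiner uses the sample (n - 1) or population (n) standard deviation would need to be confirmed against the source.

    ```python
    from statistics import mean, stdev

    # Hypothetical kappa values from a 5-fold cross validation,
    # one performance value per trained/tested model.
    fold_kappas = [0.71, 0.68, 0.74, 0.70, 0.69]

    # RapidMiner reportedly displays: mean +/- standard deviation
    avg = mean(fold_kappas)
    sd = stdev(fold_kappas)  # sample standard deviation (n - 1 denominator)

    print(f"kappa: {avg:.3f} +/- {sd:.3f}")  # → kappa: 0.704 +/- 0.023
    ```

    So the interval shown next to a metric is a spread of per-fold results, not a formal confidence interval for the metric itself.
    
    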

    Regards,

    Lionel