Confidence or Prediction Intervals
npapan69
Dear All,
When reporting a performance metric (e.g. AUC) of a model that was trained on a single data set and tested on a hold-out set, what is the proper way to assess its variance? Should I calculate a confidence interval for the AUC, or a prediction interval?
Many thanks
Nikos
Tagged: AI Studio, Performance
varunm1
Hello
@npapan69
This is a bit tricky: both intervals are centered around the same estimate, but a prediction interval is wider than a confidence interval. A confidence interval quantifies the uncertainty in the estimated metric itself, whereas a prediction interval covers where a future individual observation would fall. If you want the more conservative bound that accounts for prediction error on new cases, go with prediction intervals.
npapan69
Thanks @varunm1 for your answer. Is the prediction interval provided by any of the RapidMiner operators?
varunm1
Hello
@npapan69
I checked, and I don't see an option in RM for that. I think it is a bit complicated to calculate.
@mschmitz
or
@IngoRM
any comments on this.
Thanks
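Since there is no built-in operator for this, one common workaround for the original question (a confidence interval for the hold-out AUC) is a percentile bootstrap over the test-set predictions. Below is a minimal sketch in Python, outside of RapidMiner; the labels and scores are synthetic placeholders, so substitute your own hold-out labels and model scores. The `auc` helper uses the Mann-Whitney rank formulation and assumes no tied scores.

```python
import numpy as np

def auc(y_true, y_score):
    """AUC via the Mann-Whitney U statistic (assumes no tied scores)."""
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score))
    ranks[order] = np.arange(1, len(y_score) + 1)
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(42)

# Synthetic stand-in for a hold-out set: 0/1 labels and model scores.
# Replace these two arrays with your real test-set labels and predictions.
y_true = rng.integers(0, 2, size=200)
y_score = y_true * 0.3 + rng.normal(0.5, 0.25, size=200)

# Percentile bootstrap: resample (label, score) pairs with replacement
boot_aucs = []
for _ in range(2000):
    idx = rng.integers(0, len(y_true), size=len(y_true))
    if y_true[idx].min() == y_true[idx].max():
        continue  # a resample needs both classes for AUC to be defined
    boot_aucs.append(auc(y_true[idx], y_score[idx]))

lo, hi = np.percentile(boot_aucs, [2.5, 97.5])
print(f"AUC = {auc(y_true, y_score):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```

The same resampled scores could be post-processed inside a RapidMiner process via a Loop with sampling, but the pure-Python version above is easier to verify.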