"sensitivity/specificity of confusion matrix vs ROC curve"
sozekes
New Altair Community Member
Hi,
We used 10-fold cross-validation to classify 55 patients as responders (R) or non-responders (NR).
Using a Neural Network, RapidMiner classified the data set with 86.67% accuracy, yielding 0.933 sensitivity and 0.8 specificity according to the confusion matrix shown in the figure below.
RapidMiner also plotted the ROC curve after classification.
The point where sensitivity (TP rate) is 0.933 and 1 - specificity (FP rate) is 0.2 should therefore lie on the ROC curve (marked with a green dot on the curve).
But as you can see, the red curve gives only about 0.85 sensitivity at the 0.2 (1 - specificity) value (blue dot).
As far as we know, the sensitivity and specificity values must be consistent with the ROC curve.
Could you please let us know what you think about this?
Thanks in advance.
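For reference, here is a minimal sketch (with hypothetical labels and scores, not the 55-patient data) of the consistency check being described: when the confusion matrix and the ROC curve are built from the same set of scores, the confusion matrix's operating point (1 - specificity, sensitivity) must be one of the vertices obtained by sweeping the decision threshold.

```python
# Hypothetical example: verify that the confusion-matrix operating point
# lies on the ROC curve computed from the same scores.
def roc_points(labels, scores):
    """All (FP rate, TP rate) points obtained by sweeping the threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for l, s in zip(labels, scores) if l == 1 and s >= thr)
        fp = sum(1 for l, s in zip(labels, scores) if l == 0 and s >= thr)
        pts.append((fp / neg, tp / pos))
    return pts

labels = [1, 1, 1, 0, 0, 1, 0, 1, 0, 1]   # 1 = responder (R)
scores = [0.95, 0.9, 0.8, 0.7, 0.65, 0.6, 0.5, 0.4, 0.3, 0.2]

# Confusion matrix at a fixed 0.5 threshold:
preds = [1 if s >= 0.5 else 0 for s in scores]
tp = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 1)
fn = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 0)
tn = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 0)
fp = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 1)
sens, spec = tp / (tp + fn), tn / (tn + fp)

# The operating point must be one of the ROC vertices:
print((1 - spec, sens) in roc_points(labels, scores))  # True
```

If the point does not land on the curve, the two plots were most likely produced from different score sets, e.g. different cross-validation folds or a different averaging/estimation mode.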
Answers
Hi,
what happens if you switch to another method of estimating the AUC? You selected "optimistic"...
Greetings,
Sebastian
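To illustrate Sebastian's point: the difference between "optimistic", "neutral", and "pessimistic" AUC estimates comes from how tied scores are counted. The sketch below (a generic pairwise AUC definition, not RapidMiner's exact implementation) shows how ties can inflate the optimistic estimate, which would also shift where a given operating point appears relative to the plotted curve.

```python
# Pairwise AUC: the fraction of (positive, negative) pairs in which the
# positive example scores higher. Tied pairs are counted differently
# under each estimation mode.
def auc_variants(pos_scores, neg_scores):
    wins = ties = 0
    total = len(pos_scores) * len(neg_scores)
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return {
        "optimistic":  (wins + ties) / total,        # ties count as correct
        "neutral":     (wins + 0.5 * ties) / total,  # ties count half
        "pessimistic": wins / total,                 # ties count as wrong
    }

# Hypothetical scores with ties between the classes:
pos = [0.9, 0.7, 0.7]
neg = [0.7, 0.4]
print(auc_variants(pos, neg))
# → {'optimistic': 1.0, 'neutral': 0.8333..., 'pessimistic': 0.6666...}
```

With no ties, all three estimates coincide; the larger the gap between them, the more the model relies on tied confidence values.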