How to interpret a ROC plot?
I generate a ROC plot with the process given below. I assume that the red line (ROC) shows the proportion of TP plotted against the proportion of FP, but I can't understand what the blue line (ROC (Thresholds)) represents. Can anyone explain?
Regards,
Carlos
<operator name="Root" class="Process" expanded="yes">
<operator name="ExcelExampleSource" class="ExcelExampleSource">
<parameter key="excel_file" value="/Users/csoares/Documents/Ensino/DBM/Materiais/Catalog_multi_aula.xls"/>
<parameter key="sheet_number" value="4"/>
<parameter key="first_row_as_names" value="true"/>
<parameter key="create_label" value="true"/>
<parameter key="label_column" value="2"/>
<parameter key="create_id" value="true"/>
</operator>
<operator name="SimpleValidation" class="SimpleValidation" expanded="yes">
<parameter key="keep_example_set" value="true"/>
<parameter key="create_complete_model" value="true"/>
<operator name="NaiveBayes" class="NaiveBayes">
<parameter key="keep_example_set" value="true"/>
</operator>
<operator name="OperatorChain" class="OperatorChain" expanded="yes">
<operator name="ModelApplier" class="ModelApplier">
<parameter key="keep_model" value="true"/>
<list key="application_parameters">
</list>
</operator>
<operator name="ClassificationPerformance" class="ClassificationPerformance">
<parameter key="keep_example_set" value="true"/>
<parameter key="accuracy" value="true"/>
<list key="class_weights">
</list>
</operator>
<operator name="ROCChart" class="ROCChart">
<parameter key="use_model" value="false"/>
</operator>
</operator>
</operator>
</operator>
- The area under the curve (AUC): the integral over the ROC curve. Higher values indicate a better-performing classifier (see the sketch after this list).
- The form of the curve: ideally the curve should be as smooth as possible. Large "jumps" indicate that the model is sensitive to small changes in the dataset; the initial jump is an exception.
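To make the AUC point concrete, here is a minimal Python sketch (outside RapidMiner, using scikit-learn and made-up labels and confidence scores; all variable names are purely illustrative) that computes the ROC curve and its area:

import numpy as np
from sklearn.metrics import roc_curve, auc

# hypothetical ground truth and classifier confidences for the positive class
y_true  = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.65, 0.3, 0.7, 0.55])

# roc_curve returns one (FPR, TPR) point per candidate confidence threshold
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("AUC =", auc(fpr, tpr))  # integral of TPR over FPR; closer to 1.0 is better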
hello @michaelgloven - for these kinds of fundamental data science background topics I usually go "old school" with books (yes, paper). My go-to texts are "Data Mining for the Masses" by Dr. Matthew North and "Predictive Analytics and Data Mining" by Kotu & Deshpande. Both are excellent and full of explicit examples using RapidMiner. For your question about ROC curves, Chapter 8 of Kotu & Deshpande is all about model evaluation and starts with a long explanation of ROC.
Scott
Hi @michaelgloven,
Each point of the ROC curve is the rate of true positives (the "proportion of TP" from the first post) plotted against the rate of false positives (proportion of FP) for a specific threshold applied to the confidence of the corresponding classifier.
The ROC (Thresholds) curve just shows this confidence threshold (sometimes also called the confidence cut) for each point, so you can read off which cut produces which trade-off; the sketch below makes the relationship explicit.
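As a rough Python illustration of that relationship (with invented confidence scores, not the RapidMiner implementation), sweeping the threshold over the confidences yields exactly one (FPR, TPR) point per threshold:

import numpy as np

# hypothetical true labels and classifier confidences for the positive class
y_true  = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.65, 0.3, 0.7, 0.55])

for t in sorted(set(y_score), reverse=True):
    pred = (y_score >= t).astype(int)          # classify as positive above the cut
    tp = np.sum((pred == 1) & (y_true == 1))
    fp = np.sum((pred == 1) & (y_true == 0))
    tpr = tp / np.sum(y_true == 1)             # y-axis of the ROC curve
    fpr = fp / np.sum(y_true == 0)             # x-axis of the ROC curve
    print(f"threshold={t:.2f}  FPR={fpr:.2f}  TPR={tpr:.2f}")

The red ROC line connects the (FPR, TPR) pairs, while the blue ROC (Thresholds) line shows the threshold value that generated each pair.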
Hope this helps,
Best regards,
Fabian
I have the same question. I'm sure it's a simple answer, but I can't find an explanation in the documentation.
Thanks!