ROC curve estimation

Transcript and Presenter's Notes
1
ROC curve estimation
2
Index
  • Introduction to ROC
  • ROC curve
  • Area under ROC curve
  • Visualization using ROC curve

3
ROC curve
  • ROC stands for Receiver Operating Characteristic.
  • ROC curves are used widely in biomedical
    applications such as radiology and imaging.
  • An important use here is to assess classifiers in
    machine learning.

4
Example situation
  • Consider a diagnostic test for a disease.
  • The test has two possible outcomes: positive or
    negative.
  • Based on this example, the next slides explain the
    notation used in ROC curves.

5
Data distribution available
[Figure: distributions of test results for patients without the disease and for patients with the disease; x-axis: test result]
6
Threshold
[Figure: the two test-result distributions with a decision threshold marked on the test-result axis]
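As a minimal sketch of this idea in Python (the slides name no language, and the scores and labels below are hypothetical), each choice of threshold turns the test results into positive/negative calls and yields one (FPR, TPR) point; sweeping the threshold traces out the ROC curve:

```python
import numpy as np

# Hypothetical test results; higher values suggest disease.
scores  = np.array([0.2, 0.3, 0.4, 0.55, 0.6, 0.7, 0.8, 0.9])
disease = np.array([0,   0,   0,   1,    0,   1,   1,   1])

def rates_at(threshold):
    """Call a result positive if it is at or above the threshold."""
    called_pos = scores >= threshold
    tpr = called_pos[disease == 1].mean()  # sensitivity
    fpr = called_pos[disease == 0].mean()  # 1 - specificity
    return tpr, fpr

# Each threshold gives one (FPR, TPR) point of the ROC curve.
for t in (0.25, 0.50, 0.75):
    tpr, fpr = rates_at(t)
    print(f"threshold={t:.2f}: TPR={tpr:.2f}, FPR={fpr:.2f}")
```

Raising the threshold lowers both rates and lowering it raises both, which is the trade-off behind the definitions that follow.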
7
Some definitions ...
True positives
[Figure: patients with the disease whose test result lies above the threshold; x-axis: test result]
8
False positives
[Figure: patients without the disease whose test result lies above the threshold; x-axis: test result]
9
True negatives
[Figure: patients without the disease whose test result lies below the threshold; x-axis: test result]
10
False negatives
[Figure: patients with the disease whose test result lies below the threshold; x-axis: test result]
11
Confusion Matrix
  • A confusion matrix is a matrix consisting of two
    rows and two columns.
  • If the confusion matrix is called CMat, with rows
    giving the true class and columns the predicted
    class (as on the next slide), its entries are:
  • CMat(1,1) = true positives, CMat(1,2) = false
    negatives,
  • CMat(2,1) = false positives, CMat(2,2) = true
    negatives.

12
2-class Confusion Matrix

True class     Predicted positive   Predicted negative
positive (P)   TP                   P - TP
negative (N)   FP                   N - FP

  • Reduce the four numbers to two rates (sketched
    below)
  • true positive rate: TPR = TP / P
  • false positive rate: FPR = FP / N
  • Rates are independent of the class ratio
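A minimal Python sketch of these two rates (no language appears in the slides; the counts are Classifier 1's matrix from the next slide):

```python
# Confusion-matrix counts for Classifier 1 on the next slide.
TP, FN = 60, 40        # true-positive row:  P = TP + FN = 100
FP, TN = 20, 80        # true-negative row:  N = FP + TN = 100

tpr = TP / (TP + FN)   # TPR = TP / P
fpr = FP / (FP + TN)   # FPR = FP / N
print(f"TPR = {tpr}, FPR = {fpr}")   # TPR = 0.6, FPR = 0.2
```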

13
Comparing classifiers using the Confusion Matrix

Classifier 1
True class     Predicted pos   Predicted neg
pos            60              40
neg            20              80

Classifier 2
True class     Predicted pos   Predicted neg
pos            70              30
neg            50              50

Classifier 3
True class     Predicted pos   Predicted neg
pos            40              60
neg            30              70

Classifier 1: TPR = 0.6, FPR = 0.2
Classifier 2: TPR = 0.7, FPR = 0.5
Classifier 3: TPR = 0.4, FPR = 0.3

(Rates computed from each matrix with TPR = TP / P and FPR = FP / N.)
14
Interpretations from the Confusion matrix
  • The following metrics for a classifier can be
    calculated from the confusion matrix and used to
    evaluate it (a sketch follows the list):
  • Accuracy = (TP + TN) / (P + N)
  • Precision = TP / (TP + FP)
  • Recall = TP / (TP + FN)
  • F-score = 2 * precision * recall / (precision + recall)
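A short Python sketch of all four metrics, again using Classifier 1's counts from the comparison slide:

```python
TP, FN, FP, TN = 60, 40, 20, 80   # Classifier 1 from the comparison slide

accuracy  = (TP + TN) / (TP + FN + FP + TN)   # (TP + TN) / (P + N)
precision = TP / (TP + FP)
recall    = TP / (TP + FN)
f_score   = 2 * precision * recall / (precision + recall)

print(accuracy, precision, recall, round(f_score, 3))  # 0.7 0.75 0.6 0.667
```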

15
ROC curve
[Figure: an ROC curve, plotting true positive rate against false positive rate]
16
ROC curve comparison
[Figure: ROC curves for a good test and a poor test]
17
Area under ROC curve (AUC)
  • Overall measure of test performance
  • Two tests can be compared through differences
    between their (estimated) AUCs
  • For continuous data, the AUC is equivalent to the
    Mann-Whitney U statistic (a nonparametric test of a
    difference in location between two populations), as
    sketched after this list
  • In machine learning, the AUC summarizes the
    accuracy of a classifier
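To illustrate the Mann-Whitney connection, a minimal Python sketch (the scores are hypothetical): the AUC equals the fraction of (positive, negative) pairs in which the positive case receives the higher score, with ties counting one half.

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the Mann-Whitney U statistic divided by the
    number of (positive, negative) pairs; ties count one half."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical classifier scores for diseased / healthy subjects.
print(auc_mann_whitney([0.9, 0.8, 0.6], [0.7, 0.4, 0.3]))  # 0.888...
```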

18
AUC for ROC curves
[Figure: example ROC curves with AUC = 100%, 90%, 65%, and 50%]
19
Further evaluation methods
  • ROC-curve-based visualization
  • Visualizing the ROC curve is a very good way to
    evaluate a classifier.
  • Tools such as MATLAB, Weka, and Orange provide
    facilities for visualizing ROC curves.
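In Python, scikit-learn and matplotlib provide the same facility; a minimal sketch with hypothetical labels and scores:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

# Hypothetical ground-truth labels and classifier scores.
y_true  = [0, 0, 1, 1, 0, 1, 1, 0, 1, 0]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.6, 0.3, 0.7, 0.5]

fpr, tpr, _ = roc_curve(y_true, y_score)
plt.plot(fpr, tpr, label=f"ROC (AUC = {auc(fpr, tpr):.2f})")
plt.plot([0, 1], [0, 1], "--", label="chance")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()
```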

20
  • ROCR, an R package, is one such tool that provides
    effective ROC visualization.