External Quality Assessment for AFB Smear Microscopy

Transcript and Presenter's Notes

1
External Quality Assessment for AFB Smear Microscopy
  • Blinded Rechecking

2
WHY THIS SYSTEM?
  • Not to
  • confirm diagnoses
  • rate individual technicians / labs
  • prove that controllers are better

3
RECHECKING MERITS
  • Final aim: improve quality
  • screening for unacceptable performance
  • problem solving
  • Control of routine smears reflects reality
  • Continuous motivation and education

4
RECHECKING PROBLEMS
  • Disadvantage: workload!
  • Impossible / useless without a good program
  • regular supervision visits
  • highly functional network of intermediate labs !!
  • corrective measures

5
OVERVIEW
  • First step: correct identification of possible
    problem centres
  • LQAS sampling for least effort
  • not sure that all failing centres are weak
  • feed-back visit: evolution of type / numbers of
    errors
  • type of problem: FP or FN ?
  • correct technique for QA !
  • Laos example

7
OVERVIEW
  • Second step: investigation of causes
  • False positives
  • lack of training / experience
  • very poor microscope
  • administrative errors
  • (crystals, artifacts)
  • False negatives
  • poor stain or staining technique
  • very thick smear
  • poor microscope / poor light
  • poor examination or overload

9
ORGANIZATIONAL ASPECTS
  • Integrated with supervision system
  • Feed-back for maximum reliability
  • Complementary levels of control
  • Workload depends on:
  • total turn-over and positivity rate of the labs
  • level of decentralization
  • desired statistical accuracy

10
SAMPLING PRINCIPLES
  • As few as possible
  • overload is counterproductive
  • balance statistical and technical accuracy
  • False positives: none, or systematic error
  • statistics not required for significance
  • → check only a few
  • at frequency of occurrence
  • controllers / controlled directly comparable

12
SAMPLING PRINCIPLES (2)
  • False negatives: a few are unavoidable
  • target: minimal sensitivity
  • → Lot Quality Assurance Sampling (LQAS)
  • critical value of false negatives surpassed?
  • Include low positives (scanty per IUATLD/WHO
    scale)
  • difficult to confirm, but sensitive

13
TECHNICAL ISSUES
  • Sampling: random and representative
  • all slides kept, properly identified
  • sampling by supervisor, not technician!!
  • select from register
  • check on completeness of collection
  • calculate sampling interval (e.g. 1,200 register
    entries / sample of 120 → every 10th slide)

14
TECHNICAL ISSUES (2)
  • Screening by first controllers
  • blind checking absolutely necessary
  • avoid overloading the controllers
  • better sacrifice statistical power
  • max. 10 slides per day (plus other work)
  • same technique (i.e. no. of fields) and good
    microscopes

15
TECHNICAL ISSUES (3)
  • Re-staining
  • fading of fuchsin colour
  • true positives → false positives
  • very fast under extreme conditions
  • always restain discordants

18
TECHNICAL ISSUES (4)
  • Re-staining before first screening: no consensus
  • effect of fading more extreme on low positives
  • so false negatives → undetectable by
    controllers
  • poorly stained AFB fade faster
  • effect not only in extreme climates
  • restain not to miss gross deficiencies of
    stain/staining
  • so better restain all slides before first
    controls ??

20
TECHNICAL ISSUES (5)
  • Gold standard ???
  • controllers also make errors
  • countercheck: second control on discordants only
  • higher level, or head technician
  • no new bias: false-positive → false-negative
    controller
  • false negatives usually confirmed
  • may require long search
  • re-stain if not done earlier
  • blinding: only as to the origin of both results

21
TECHNICAL ISSUES (6)
  • Controllers must also be evaluated !
  • sample from their routine work ??
  • First controllers: compare FN rates with
    periphery
  • second control results → error by whom? what
    type?
  • rates: denominators = numbers of reported results
  • only for the sum of controlled labs vs. the
    controller
  • validity problem if:
  • no errors for a bigger number of centres
    together
  • controller made more FN than periphery
  • Evaluation of second controllers:
  • send back slides with serious errors

22
SAMPLE SIZE
  • Sample size: LQAS method
  • from industry, smallest sample
  • one-sided test: no exact results, wide confidence
    limits
  • except for a bigger area, i.e. a country
  • example: 3 false negatives in a sample of 90
    (≈ 3%; see the sketch below)
  • 1 centre: 0.8 to 9.1%
  • average of 12 centres: 2.1 to 4.2%
  • average of 850 centres: 2.9 to 3.1%
  • probably not more than x% errors in the lot:
  • accept if not more than d errors in the sample
  • reject if more than d errors in the sample
  • but the reverse is not certain !!

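The confidence limits quoted above can be reproduced approximately with Clopper-Pearson exact binomial intervals; this is a minimal sketch, and small differences from the slide's figures are down to the interval method used:

```python
from scipy.stats import beta

def exact_ci(errors, n, conf=0.95):
    """Clopper-Pearson exact confidence interval for an error proportion."""
    a = (1 - conf) / 2
    lo = beta.ppf(a, errors, n - errors + 1) if errors > 0 else 0.0
    hi = beta.ppf(1 - a, errors + 1, n - errors) if errors < n else 1.0
    return lo, hi

# One centre: 3 false negatives among 90 rechecked slides -> very wide.
print(exact_ci(3, 90))       # about (0.007, 0.094)
# Twelve centres pooled at the same error rate: 36/1080 -> much narrower.
print(exact_ci(36, 1080))    # about (0.023, 0.046)
```
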
23
SAMPLE SIZE (2)
  • LQAS sample size tables: parameters
  • acceptance number "d" (specificity, power)
  • confidence level (95%)
  • definition & size of a lot
  • lab annual turn-over recommended
  • critical value for false negatives (sketch
    below): calculate using
  • positivity rate of the labs
  • acceptable lower limit of sensitivity

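The sample-size search itself is small enough to sketch. The critical value below is reconstructed from the parameters listed above (positivity rate, acceptable lower limit of sensitivity), assuming it is defined as the share of false negatives among reported negatives when sensitivity sits exactly at the acceptable limit; treat this as an illustration, not the official table:

```python
from scipy.stats import binom

def critical_fn_rate(spr, sensitivity):
    """Expected false-negative share among reported negatives when the lab
    works exactly at the acceptable sensitivity (assumed definition)."""
    true_pos = spr / sensitivity            # truly positive share of all slides
    missed = true_pos * (1 - sensitivity)   # false negatives, share of all slides
    return missed / (1 - spr)               # ...as a share of reported negatives

def lqas_sample_size(p_crit, d=0, conf=0.95):
    """Smallest n so that a lot with error rate p_crit shows more than d
    errors in the sample (i.e. is rejected) with probability >= conf."""
    n = d + 1
    while binom.cdf(d, n, p_crit) > 1 - conf:
        n += 1
    return n

p = critical_fn_rate(spr=0.10, sensitivity=0.80)
print(round(p, 4))             # 0.0278
print(lqas_sample_size(p))     # 107 slides for d = 0 at 95% confidence
```
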
25
Critical value
  • An upper threshold for the proportion of false
    negatives
  • To be chosen from an estimate:
  • Historical false negative rates
  • Calculation based on:
  • Prevalence of positives
  • Expected parameters of sensitivity and
    specificity compared to the controller (worked
    example below).

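As a worked illustration with assumed values (SPR 10%, acceptable sensitivity 80%): true positives make up 0.10 / 0.80 = 12.5% of all slides; the 20% of them that are missed amount to 0.125 × 0.20 = 2.5% of all slides; spread over the 90% of slides reported negative, that is 0.025 / 0.90 ≈ 2.8%, the critical value fed into the LQAS tables.
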
29
Blinded Rechecking
  • Determine sample size
  • Properly store slides
  • Collect a random, representative sample
  • Ensure blinding
  • Recheck smears
  • Resolve discrepancies between the original result
    and the result of the rechecking laboratory
  • Feedback to province, district and local labs

30
Determining Sample Size
  • Slide Positivity Rate (SPR)
  • The proportion of positive AFB smears among all
    slides (diagnostic and monitoring) in the
    laboratory from which the sample is being taken.
  • This rate is estimated using the lab registers
    from the previous year or the preceding four
    quarters.
  • Sample sizes can be set using the average
    positivity rate for a laboratory, region, or
    country.

31
Slide Positivity Rate (SPR)
  • SPR = (number of positive smears per year /
    annual slide volume) × 100 (worked example below)

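With illustrative figures: a laboratory reading 4,500 slides a year, 450 of them positive, has SPR = (450 / 4,500) × 100 = 10%.
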
32
  • SENSITIVITY
  • The sensitivity, as defined here, is the
    detection of all positives, including low
    positives (1-9 AFB/100 fields).
  • The expected performance in detecting positives,
    as compared to the microscopists rechecking the
    slides.
  • Acceptable sensitivity is determined by the
    National Reference Laboratory (NRL).
  • A sensitivity of 75-80% is recommended.

33
Determination of annual sample size
36
Sample Size Determination Example
Note: If variation in slide volume or positivity
rate among the centers in a supervisor's area is
considered excessive, a few choices depending on
the ranges of volume and positivity rate may be
given. In areas with extreme variability,
collectors might even be given a list of
individual sample sizes per laboratory, based on
each laboratory's performance the previous year.
37
Sample Size Determination Example
Note: the sample size does not vary considerably
once the annual workload exceeds 1,000; therefore
rounding off will not affect the calculations (see
the sketch below).
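
A self-contained check of this note, using the finite-lot (hypergeometric) form of LQAS with the illustrative 2.8% critical value derived earlier; the lot sizes are assumptions chosen to bracket the 1,000-slide mark:

```python
from scipy.stats import hypergeom

def lqas_n_finite(lot, p_crit, d=0, conf=0.95):
    """Smallest sample n so that a lot containing round(p_crit * lot) false
    negatives yields more than d sample errors with probability >= conf."""
    k = round(p_crit * lot)                       # defective slides in the lot
    n = d + 1
    while hypergeom.cdf(d, lot, k, n) > 1 - conf:
        n += 1
    return n

for lot in (500, 1000, 5000, 20000):
    print(lot, lqas_n_finite(lot, 0.028))
# Sample size rises with the lot but flattens out: roughly 97, 101, 105, 106.
```
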
42
Collecting Slides (Sampling)
  • Slide Storage
  • Slide Collection
  • Slide Selection

43
  • It is recommended that one quarter of the total
    sample size be collected during each quarterly
    supervision visit.

44
Sampling Example
  • Laboratory Register
  • Begin with a random number - consider using a
    digit from a currency note (KZT)
  • Example - sample every 10th slide (see the
    sketch below)

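A sketch of the selection step; the register size and sample size are illustrative, and the seeded random start stands in for the manual trick of taking a digit from a banknote:

```python
import random

def systematic_sample(register_size, sample_size, seed=None):
    """Every k-th register entry after a random start, k = register // sample."""
    interval = max(register_size // sample_size, 1)
    start = random.Random(seed).randint(1, interval)   # random start in 1..k
    return list(range(start, register_size + 1, interval))

# 1,200 register entries, sample of 120 -> interval 10, e.g. slides 7, 17, 27, ...
picked = systematic_sample(1200, 120)
print(len(picked), picked[:3])
```
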
46
ANALYSIS OF RESULTS
  • Numbers of errors
  • Grading of errors (see the sketch below)
  • cut-off for positivity: serious / minor errors
  • HFP, HFN vs. LFP, LFN (high / low false
    positives and negatives)
  • priority for intervention: serious errors
  • quantification errors secondary
  • indicative of stain problems if restaining was
    done

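One way to code the error grading, as a sketch; the scale follows the IUATLD/WHO grades quoted earlier, but where exactly the minor/serious line is drawn (and when quantification differences count) varies between programs, so the thresholds below are assumptions:

```python
# IUATLD/WHO reading scale, low to high.
GRADES = {"NEG": 0, "SCANTY": 1, "1+": 2, "2+": 3, "3+": 4}

def classify(original, recheck):
    """Label a periphery-vs-controller discordance.
    HFP/HFN = high (serious) false positive/negative,
    LFP/LFN = low (minor), QE = quantification error."""
    o, r = GRADES[original], GRADES[recheck]
    if r == 0 and o > 0:                    # controller saw nothing
        return "HFP" if o >= 2 else "LFP"
    if o == 0 and r > 0:                    # periphery saw nothing
        return "HFN" if r >= 2 else "LFN"
    return "QE" if abs(o - r) >= 2 else "OK"   # small grade shifts ignored

print(classify("1+", "NEG"))      # HFP
print(classify("NEG", "SCANTY"))  # LFN
print(classify("3+", "1+"))       # QE
```
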
48
ANALYSIS OF RESULTS
  • Identify possibly poor centres
  • FN acceptance number surpassed? HFP present?
  • Feed-back: supervision visit
  • listing of slides with serious errors
  • identify and solve problems
  • further validation (especially FP)
  • check on second-level controllers

50
Interpretation
  • If there are no errors, the performance goal has
    been met (95% confidence level).
  • Even though the sample size is based on
    sensitivity, it is still likely that one or more
    errors will be found even in laboratories that
    are performing at or above the expected level
    (see the sketch below).
  • If errors are detected, the interpretation and
    appropriate action may differ depending on the
    number and type of errors, as well as the
    resources and capacity of the program.

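To make the second point concrete, with assumed figures (n = 107 at d = 0, and a lab whose true false-negative rate is half the 2.8% critical value, i.e. performing well above the acceptable limit):

```python
p, n = 0.014, 107            # true FN rate of a well-performing lab; sample size
print(1 - (1 - p) ** n)      # ~0.78: at least one error is more likely than not
```
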
51
LIMITATIONS OF THE SYSTEM
  • False alarms: chance errors
  • LFP always doubtful
  • (ignore when few and equally distributed)
  • Deliberate cheating ....

52
CAUSES OF ERRORS
  • Many HFP and also many HFN (nonsense results)
  • technician does not know the technique at all
  • gross technical problem (microscope?)

53
CAUSES OF ERRORS (2)
  • HFP
  • single administrative mistake
  • few, with some LFP
  • poor identification, registration
  • confusing AFB with artefacts
  • (contaminated stain)

54
CAUSES OF ERRORS (3)
  • LFP
  • ignore if just a few, equally distributed
  • majority of scanty is false, plus some HFP
  • confusing AFB with artefacts
  • excessive decentralization ?

55
CAUSES OF ERRORS (4)
  • HFN (and LFN)
  • poor stains
  • poor technique (smearing, staining)
  • poor equipment (light, fungus)
  • often poor work/overload
  • rarely poor training
  • chance finding (by exclusion, follow-up)

56
CAUSES OF ERRORS (5)
  • QE (quantification errors)
  • little experience
  • poor work/overload
  • poor equipment (light, fungus)
  • may point to:
  • stain(ing) problems (if re-stained for controls)
    → too low
  • fading problem (if not re-stained) → too high