Challenges in Classifying Adverse Events in Cancer Clinical Trials

1
Challenges in Classifying Adverse Events in Cancer Clinical Trials
  • Steven Joffe, MD, MPH
  • Dave Harrington, PhD
  • David Studdert, JD, PhD
  • Saul Weingart, MD, PhD
  • Damiana Maloof, RN

2
Disclosure
  • Member of clinical trial adverse event review
    board for Genzyme Corp (not oncology-related)

3
Adverse Events in Clinical Trials
  • Adverse events (AEs) are critically important
    outcomes of clinical trials
  • Human subjects protection
  • Endpoints for judgments about benefits and risks
    of study interventions
  • Captured on Case Report Forms
  • Reported to oversight agencies

4
Components of AE Assessment
  • Type
  • Severity
  • Relatedness to study agent(s)
  • Expectedness

6
Reporting Criteria (to Dana-Farber IRB)
  • Grade 5 (fatal)
  • Grade 4, unless specifically exempted
  • Grade 2/3, if unexpected AND possibly, probably,
    or definitely related
  • Virtually identical to NCI's Adverse Event
    Expedited Reporting System (AdEERS) criteria
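These criteria amount to a small decision rule. Below is a minimal Python sketch; the function and field names, and the exempted flag for the grade-4 exemptions, are illustrative assumptions rather than part of the DFCI or AdEERS specification.

  # Sketch of the reporting rule above; names are illustrative assumptions.
  RELATED = {"possible", "probable", "definite"}

  def reportable(grade: int, relatedness: str, expected: bool,
                 exempted: bool = False) -> bool:
      """True if an AE meets the IRB reporting criteria on this slide."""
      if grade == 5:            # fatal: always reportable
          return True
      if grade == 4:            # reportable unless specifically exempted
          return not exempted
      if grade in (2, 3):       # reportable only if unexpected AND related
          return not expected and relatedness.lower() in RELATED
      return False              # grade 1: not reportable under these criteria

  # Grade 4 thrombosis, unlikely related, unexpected: reportable
  assert reportable(4, "unlikely", expected=False)
  # Grade 3 osteonecrosis, definitely related, expected: not reportable
  assert not reportable(3, "definite", expected=True)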

7
AE Grading in Oncology
  • NCI's Common Terminology Criteria for Adverse
    Events (CTCAE) typically used
  • Effort to standardize nomenclature
  • Developed by consensus methods; no formal process
    to establish reliability of grading

http://ctep.cancer.gov/protocolDevelopment/electronic_applications/ctc.htm#ctc_v30
8
Aims
  1. To assess the validity of physician reviewers'
    determinations about whether AEs in cancer trials
    meet IRB reporting criteria
  2. To assess the interrater reliability of
    reviewers' determinations about whether AEs that
    occur in cancer trials meet IRB reporting
    criteria
  3. To assess the validity and reliability of
    reviewers' judgments about the components of AEs

9
Study Methods
10
Panelists' Roles
  • Review primary data from criterion sets of AEs
  • Rate each AE:
  • Classification
  • Grade (from CTCAE)
  • Relatedness
  • Expectedness
  • Reportable to IRB
11
Panelist Demographics

                                 Expert Panel (n=3)   Second Panel (n=10)
Years since fellowship training
  Mean                           20 yrs               6.3 yrs
  Range                          10-32 yrs            2-17 yrs
Academic rank
  Instructor / Asst Prof         1                    10
  Assoc Prof / Prof              2                    0
12
Panelists' Experience

                                 Expert Panel (n=3)   Second Panel (n=10)
Clinical trials served as overall Principal Investigator
  0-5                            0                    7
  ≥6                             3                    3
Clinical trials served as PI, site PI, or Co-Investigator
  0-5                            0                    2
  6-20                           0                    3
  >20                            3                    5
13
Panelists' Experience

                                 Expert Panel (n=3)   Second Panel (n=10)
Patients personally enrolled in a clinical trial during past 3 years
  0-10                           0                    1
  11-30                          1                    4
  >30                            2                    5
Adverse event reports personally filed with the IRB during past 3 years
  0-10                           1                    6
  11-30                          0                    1
  >30                            2                    3
14
Statistical Analysis
  • Validity of judgments regarding reportability to
    IRB
  • Agreement with gold standard
  • Interrater reliability of raters' judgments
  • Kappa coefficients
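The slide specifies only "kappa coefficients"; with ten raters judging each AE, Fleiss' kappa is one standard choice, so the sketch below assumes it. The ratings matrix is a made-up placeholder, not study data.

  # Interrater reliability via Fleiss' kappa (assumed variant).
  # Requires numpy and statsmodels.
  import numpy as np
  from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

  # ratings[i, j] = rater j's judgment of AE i (placeholder values):
  # 1 = reportable, 0 = not reportable
  ratings = np.array([
      [1, 1, 1, 1, 1],
      [0, 1, 0, 0, 1],
      [0, 0, 0, 0, 0],
      [1, 1, 1, 0, 1],
  ])

  table, _ = aggregate_raters(ratings)  # AEs x categories count table
  print(f"kappa = {fleiss_kappa(table):.2f}")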

15
Results
16
Criterion Set of AEs
Type of AE Grade Related Expected Reportable
High triglycerides 4 Definite Y Y
Osteonecrosis 3 Definite Y N
Sensory neuropathy 1 Probable Y N
Cardiac ischemia 4 Possible Y Y
Rash 2 Probable Y N
Thrombosis 4 Unlikely N Y
High uric acid 4 Probable N Y
Cardiac dysfunction 2 Definite Y N
Thrombotic thrombo- cytopenic purpura 4 Possibly N Y
Renal failure 4 Definite Y Y
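As a consistency check, the Reportable column follows mechanically from the criteria on slide 6; here it is verified against the reportable() sketch given there (rows encoded as grade, relatedness, expected, reportable):

  # Verify the table above against the reportable() sketch from slide 6.
  rows = [
      (4, "definite", True,  True),   # High triglycerides
      (3, "definite", True,  False),  # Osteonecrosis
      (1, "probable", True,  False),  # Sensory neuropathy
      (4, "possible", True,  True),   # Cardiac ischemia
      (2, "probable", True,  False),  # Rash
      (4, "unlikely", False, True),   # Thrombosis
      (4, "probable", False, True),   # High uric acid
      (2, "definite", True,  False),  # Cardiac dysfunction
      (4, "possible", False, True),   # TTP
      (4, "definite", True,  True),   # Renal failure
  ]
  assert all(reportable(g, r, expected=e) == rep for g, r, e, rep in rows)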
17
Validity of Judgments Regarding Reportability to IRB

Adverse Event             Not Reportable   Reportable   % Agree
1. High triglycerides     0                10           100
2. Osteonecrosis          6                4            60
3. Sensory neuropathy     10               0            100
4. Cardiac ischemia       0                10           100
5. Rash                   9                1            90
6. Thrombosis             0                10           100
7. High uric acid         0                10           100
8. Cardiac dysfunction    8                2            80
9. TTP                    0                10           100
10. Renal failure         0                10           100
TOTAL                                                   93
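The overall 93% pools all 100 individual judgments (10 reviewers x 10 AEs); a quick check using the counts from the table above:

  # Overall % agreement with the gold standard, from the table above.
  # Each pair: (judgments matching the gold standard, total judgments).
  per_ae = [(10, 10), (6, 10), (10, 10), (10, 10), (9, 10),
            (10, 10), (10, 10), (8, 10), (10, 10), (10, 10)]
  agree = sum(a for a, _ in per_ae)
  total = sum(n for _, n in per_ae)
  print(f"overall agreement = {100 * agree / total:.0f}%")  # -> 93%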
18
Interrater Reliability of Panelists' Judgments

Judgment        Kappa   P value
Reportability   0.75    <0.0001
Grade           0.52    <0.0001
Relatedness     0.22    <0.0001
Expectedness    0.88    <0.0001
19
Role of Experience: Academic Rank
[Chart: kappa coefficients by academic rank]
20
Role of Experience: Service as PI
[Chart: kappa coefficients by number of trials served as PI]
21
Role of Experience: Number of AE Reports Filed
[Chart: kappa coefficients by number of AE reports filed]
22
Conclusions
  • Oncologists' judgments about whether AEs require
    reporting to the IRB show high agreement with the
    gold standard
  • Interrater reliability of oncologists' judgments
    about components of AEs varies
  • High: expectedness of AE, need for reporting
  • Moderate: grade of AE
  • Low: relationship of AE to study agents

23
Limitations
  • Small sample sizes
  • Criterion set of AEs
  • Panel of physician reviewers
  • Generalizability of set of AEs
  • Reviewers may not reflect population of
    investigators who file AE reports
  • Judgments based on document review rather than on
    firsthand knowledge

24
Thoughts About Direction of Bias in Agreement Statistics
  • Factors biasing towards less agreement
  • Reviewer experience
  • Factors biasing towards greater agreement
  • Standardized set of documents for review
  • Criterion set selected based on maximum agreement
    among expert panel reviewers

25
Implications
  • Judgments about AEs are complex
  • Human subjects: efforts to enhance reliability,
    or to minimize reliance on judgments about
    causation, are needed
  • Science: toxicity data from uncontrolled trials
    may be misleading
  • RCR (responsible conduct of research) education
    about the need for reporting is important but
    insufficient

26
Acknowledgments
  • Debra Morley
  • Anna Mattson-DiCecca
  • Physician panelists
  • ORI
  • NCI
  • Milton Fund