The Evaluation Imperative: Lessons from the K30 Programs

Transcript and Presenter's Notes

1
The Evaluation Imperative: Lessons from the K30 Programs
  • Michael J. Lichtenstein, M.D.
  • Program Director
  • Master of Science in Clinical Investigation
  • University of Texas Health Science Center
  • San Antonio, TX

2
Survey of the impact of National Institutes of Health clinical research curriculum awards (K30) between 1999 and 2004
Bakken LL, Lichtenstein M, and the ACRTPD Evaluation Committee
Journal of Investigative Medicine 53(3):123-7, 2005
  • Purpose: To determine the early capacity of the 59 NIH K30 programs (funded 1999-2004) to produce clinical investigators trained in core clinical research skills.
  • Methods: 37-item Web-based survey distributed to K30 programs in July 2004.
  • Results
  • 76% of K30 programs (45/59) responded to the survey.
  • The average number of active trainees in each program was 32.
  • Women constituted 53% of active trainees, and 22% of active trainees were underrepresented minorities.
  • 96% of active trainees had medical degrees.
  • The average number of graduates over the 5-year funding period was 18 (a quick arithmetic check of these figures appears in the sketch below).
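
As a quick check on the figures above, the short Python sketch below re-derives the 76% response rate and the totals implied by the per-program averages. The raw counts come from this slide; the variable names and the script itself are illustrative, not part of the published survey.

```python
# Illustrative re-derivation of the headline survey figures; the raw
# counts below are taken from the slide, everything else is assumed.
funded_programs = 59      # K30 programs funded 1999-2004
responding_programs = 45  # programs that returned the 37-item survey

response_rate = responding_programs / funded_programs
print(f"Response rate: {response_rate:.0%}")  # -> 76%

# Totals implied by the per-program averages reported on the slide.
avg_active_trainees = 32
avg_graduates = 18
print(f"Implied active trainees: ~{responding_programs * avg_active_trainees}")   # ~1440
print(f"Implied graduates over 5 years: ~{responding_programs * avg_graduates}")  # ~810
```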

3
Survey of the impact of National Institutes of Health clinical research curriculum awards (K30) between 1999 and 2004
Bakken LL, Lichtenstein M, and the ACRTPD Evaluation Committee
Journal of Investigative Medicine 53(3):123-7, 2005
  • Results (continued)
  • Of the graduates, 50% were women and 17% were underrepresented minorities.
  • 44% earned M.Sc. degrees; 13% earned other degrees.
  • 61% of K30 program graduates had some extramural funding to support their research.
  • The average number of publications per trainee for all trainees (active and graduate) was 2.3.
  • Conclusions
  • The K30 program is a catalyst at multiple institutions for improving the pedagogy of clinical research training.
  • It successfully fulfilled the mandate set forth by the 1998 NIH Director's Panel on Clinical Research (Nathan Report).

4
Action Plan 4: To collect, review, and disseminate tools and methods for program evaluation (2005)
  • Provide a resource for the Clinical Research Education community to locate and adapt valid, effective evaluation tools for their programs.
  • Ask the K30 programs to submit any tools and instruments used to evaluate their programs:
  • Specific course and/or program evaluation instruments (request both qualitative and quantitative instruments)
  • Background on development and testing of the instruments
  • Ask permission to share evaluation tools with the ACRTPD membership.
  • Review the instruments.
  • Utilize the ACRT website to disseminate information about research training program evaluation (instruments and methodology); a hypothetical catalog record for such instruments is sketched below.
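
One way to make submitted tools easy to locate and adapt is to catalog each instrument in a uniform record. The Python dataclass below is a hypothetical sketch of such a record; the field names mirror the requests on this slide but are not taken from any actual ACRTPD system.

```python
from dataclasses import dataclass, field

# Hypothetical catalog record for one submitted evaluation instrument;
# fields mirror the information requested on this slide.
@dataclass
class EvaluationInstrument:
    program: str             # submitting K30 program
    title: str               # instrument name
    scope: str               # "course" or "program"
    kind: str                # "qualitative" or "quantitative"
    development_notes: str   # background on development and testing
    sharing_permitted: bool  # permission to share with ACRTPD members
    keywords: list[str] = field(default_factory=list)

# Example entry, using invented values purely for illustration.
example = EvaluationInstrument(
    program="UT Health Science Center San Antonio",
    title="MSCI course evaluation form",
    scope="course",
    kind="quantitative",
    development_notes="Adapted from a standard end-of-course survey.",
    sharing_permitted=True,
    keywords=["course evaluation", "Likert"],
)
```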

5
Review of K30 Program Evaluation Methods
6
Review of K30 Program Evaluation Methods (Cont'd)
7
Review of K30 Program Evaluation Methods and
Impact
  • Low response rates from K30 programs
  • Self-reported information is a limitation
  • No clear pattern across the small sample of programs regarding:
  • Types of evaluation processes
  • Outcomes monitored

8
What Are the Next Steps? Proposal for a Web-Based Portal
  • NCRR supports development of a web-based portal to collect a core set of predictor and outcome variables from each of the K30 programs.
  • Predictor variables. Examples: gender, institution, department.
  • Outcome variables. Examples:
  • Time to graduation
  • Time to first grant funding
  • Publication records and graduation rates
  • Trainee information would be collected in a systematic, standardized manner.
  • Following a protocol, the model would be similar to multi-center clinical trials, where personnel enter their site data directly into a database (a hypothetical record schema is sketched below).
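
To make the multi-center-trial analogy concrete, here is a minimal sketch of a per-trainee portal record, assuming Python as the implementation language. The broad predictor/outcome split comes from the slide; every field name is a hypothetical example.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-trainee record for the proposed K30 portal.
# Predictor fields are known at enrollment; outcome fields are filled
# in by site personnel as events occur, as in a multi-center clinical
# trial database.
@dataclass
class TraineeRecord:
    # Predictor variables (examples named on the slide)
    trainee_id: str
    gender: str
    institution: str
    department: str
    enrollment_year: int
    # Outcome variables (examples named on the slide)
    years_to_graduation: Optional[float] = None
    years_to_first_grant: Optional[float] = None
    publication_count: Optional[int] = None
    graduated: bool = False
```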

9
(No Transcript)
10
Reasons for Delayed Graduation
  • Part-time Matriculation
  • Leaves of Absence
  • Clinical Responsibilities
  • Maternity Leave
  • Delays in Research Projects
  • Slow subject accrual
  • Grant and contract support
  • Mentor and Trainee Interactions

11
K30 Program-Wide Evaluation
  • Program Directors would have access to reports about their programs to help guide curriculum development, organization, and delivery.
  • The portal could be maintained by the NIH or contracted to another party (e.g., the AAMC).
  • This approach could substitute for filing an annual report for each non-competing renewal; complete data entry would be the criterion for obtaining funding from year to year within the grant cycle (a sketch of such a completeness check appears below).
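
If complete data entry were the year-to-year funding criterion, the portal would need a simple per-program completeness check. The sketch below builds on the hypothetical TraineeRecord above; the threshold and logic are invented placeholders, not anything specified in the proposal.

```python
def data_completeness(records: list[TraineeRecord]) -> float:
    """Fraction of outcome fields filled in across a program's trainee records."""
    if not records:
        return 0.0
    outcome_fields = ("years_to_graduation", "years_to_first_grant", "publication_count")
    filled = sum(getattr(r, f) is not None for r in records for f in outcome_fields)
    return filled / (len(records) * len(outcome_fields))

def entry_complete(records: list[TraineeRecord], threshold: float = 0.95) -> bool:
    # Hypothetical gate: the 0.95 threshold is an invented placeholder.
    return data_completeness(records) >= threshold
```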

12
The Evaluation Imperative: Questions and Discussion