Strategies for the Evaluation of Adult Science Media

1
Strategies for the Evaluation of Adult Science
Media
Saul Rockman - Saul@rockman.com
Jennifer Borland - Jennifer@rockman.com
Kristin Bass - Kristin@rockman.com
Monnette Fung - Monnette@rockman.com
www.rockman.com | San Francisco, CA | Bloomington, IN
2
Retrospective Review of Evaluations of Adult
Science Media
Saul Rockman, ROCKMAN ET AL
www.rockman.com
April 14, 2009
3
Commissioned Paper
  • Media-Based Learning Science in Informal Environments (2007)
    (http://tinyurl.com/REA_NRC_Paper)
  • Learning Science in Informal Environments: People, Places, and Pursuits (2009)
    (http://tinyurl.com/NRC_Book)
  • Supported by NSF, NRC, National Academy of Sciences, Board on Science Education
  • A comprehensive synthesis of research on science learning in informal environments.

4
Review of the Literature
  • More than 50 studies from 1999-2006
  • Covers both adult and children's programming
  • Much is fugitive literature, not accessible
  • Categorized by media form

5
Research Themes
  • Adult vs. Youth: Differences between adult learning and youth learning.
  • Formal vs. Informal: Differences between formal and informal learning.
  • Media vs. Non-mediated: Learning from media vs. learning from non-mediated formats.

6
Media for Children vs. Adults
  • Children's media
    • Series
    • Repetition
    • Iterative
    • Curriculum design
    • Intentional
  • Adult media
    • One-offs
    • Content not consistent
    • Informational
    • News focus

7
Adult Audience
  • Older, wealthier, whiter, more educated
  • Science, news, arts (persistent)
  • Increasingly using multiple media
  • Autonomous, self-directed, practical, looking for
    respect and relevance

8
Differences between formal and informal learning
Two Key Areas of Difference
Context for learning
  • When and why the learning is taking place
  • Locus of responsibility for learning
    (teacher-directed or learner-directed)

Potential or desired learning outcomes
  • The goal of ISE is enabling the ideas and
    information to be integrated more fully into ways
    of thinking or ways of behaving
  • ISE not geared toward formal assessments

9
Differences between learning in mediated and
non-mediated formats
  • Mediated content can promote more self-directed
    learning and therefore deeper processing
  • Pacing varies in mediated formats (pros and cons)

Instructional Design | Cognitive Science | Communication Theory
10
Media
  • Television/video (including video files on the
    Internet)
  • Radio/audio (including podcasts and streamed
    audio)
  • Film
  • Large Format Film (e.g., IMAX)
  • Planetarium shows
  • Not Websites, print, brief videos in museums,
    etc.

11
Accessibility Continuum
From highly accessible to limited accessibility:
  • Internet
  • Home Video Distribution
  • Broadcast Media
  • Location-Based
12
The Lay of the Land
  • More than PBS / NPR
  • Attributes of science and nature programming

  • "Voice of God" narration
13
Why are Programs Like This?
  • Schedule drives design and production
  • Media requires significant funding
  • The money is in production
  • Built on values, not theory
  • Review process focuses on media, not outcomes

14
"There would be no bucks without Buck Rogers."
  • Old NASA adage
15
Framework for Adult Learning From Media
  • Media Production Context
  • Individual-level Inputs
  • Activities
  • Outcomes
    • Short-term
    • Mid-term
    • Longer-term
  • Program Goals
  • External Influences

16
Summary of Outcomes
  • Limited range of outcomes
  • Methodological weaknesses
  • Limited generalizability (selection bias)

17
"The difference between outputs and outcomes is like the difference between 'what is so' and 'so what?'"
  • Michael Scriven

18
Getting to "So What?"
  • Policy and practice
  • More focus on research in RFPs
  • Enhanced funding
  • More creative research approaches
  • More powerful research methodologies
  • Interactive multiple media strategies

19
"Of course it works in practice, but will it work in theory?"
  • French research saying

20
Informal Science Evaluation Methodologies
Jennifer Borland - Jennifer@rockman.com
21
Four Main Categories of Evaluation Outcomes
Learn, Feel, Think, Do
22
Methods Mapped to Outcomes
  • Learning: self-reports, recall, little application of learning
  • Engagement: self-reports, appeal associated with regular viewing/listening
  • Attitude change: short term, increased interest, rarely a control group
  • Behavior: information seeking, discussions

23
Learning
  • Self-rated level/amount of learning
  • Subjects answer questions related to content
    learning
  • Observable learning outcomes (applied knowledge)

Methods to Consider: pre-post testing, assessments of higher-order thinking skills, control/treatment groups, better sampling, transfer tasks, more longitudinal assessment
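The pre-post design named above is the workhorse of these learning assessments. As a minimal sketch (the viewer scores below are hypothetical, not data from any study reviewed here), the paired-samples t statistic compares each person's post-viewing score against their own pre-viewing score:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t statistic for pre/post test scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical content-test scores for eight viewers
pre = [3, 4, 2, 5, 3, 4, 2, 3]
post = [5, 5, 4, 6, 4, 5, 3, 5]
t = paired_t(pre, post)
```

In practice the statistic would be checked against a t distribution with n-1 degrees of freedom, and adding a no-viewing control group, as suggested above, guards against attributing ordinary gains to the program.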
24
Engagement/Enjoyment
  • Indicators
  • Attention
  • Appeal
  • Why more/less attention and appeal? (causes and
    correlates)

Methods to Consider: Measures of physiological responses, better sampling and instrument construction to facilitate multivariate analysis
25
Attitude Change
  • Self-Reported Change
  • Towards science in general
  • Science content in program

Methods to Consider: More in-depth assessments and longitudinal studies
26
Behavior
  • Intended
  • Actual

What did you do?
  • Level of depth varies

Methods to Consider: More observed behavioral change, longitudinal behavioral change, randomized control/treatment studies
27
Challenges
  • Funding: More rigorous evaluations cost more - "the money needs to be on the screen"
  • Timing: Interest in quick results, moving on to the next thing
  • Logistics: IRB approval, sample selection
  • Lack of buy-in: seen as a necessary evil, a luxury more than a necessity; good reviews/ratings are perceived to count for more
  • Reality: a thirty-minute program isn't going to change someone's life dramatically

28
Solutions/Suggestions
  • Given adequate funding/timeframes
  • Better/more rigorous and powerful methodologies
  • Control Group Studies
  • Panel Studies
  • Better Samples/Strategic Audience Sampling
    (include non-traditional and reluctant audiences)
  • Longitudinal Studies
  • Multivariate analysis (finding new connections)
  • New/Unique methods
  • Specific to informal learning (different from
    formal)
  • Beyond the individual (dyads, groups, etc.)
  • New Modes of Informal Learning Web, Games,
    Mobile, etc.
  • Theory-based/Theory building psychology, mass
    communication, cognitive science

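The control-group studies suggested above begin with random assignment of recruited participants. A minimal sketch (participant IDs and the even split are hypothetical):

```python
import random

def assign_groups(participants, seed=0):
    """Shuffle recruited participants and split them into two equal arms."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"treatment": pool[:half], "control": pool[half:]}

# Twenty hypothetical participant IDs
groups = assign_groups([f"P{i:02d}" for i in range(1, 21)])
```

Randomizing before exposure to the program is what lets an evaluation attribute pre-post differences to the media itself rather than to who chose to watch.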
29
Assessing Knowledge of Exploring Time
Kristin Bass - Kristin@rockman.com
www.rockman.com | San Francisco, CA | Bloomington, IN
30
New Directions in ISE Evaluation
  • Rigor
  • Design
  • Instrumentation
  • Practicing what we preach

31
  • TV show with accompanying website (http://www.exploringtime.com/)
  • Program objectives
  • Program format

32
Assessing Learning
33
Construct Identification
34
Item Generation
  • Prior evaluations
  • Show producers
  • Script

35
Sample Item 1
  • Please describe what is changing in the following
    scene.

36
Sample Item 2
In order to explain why a heart muscle goes into
arrhythmia, scientists have to drill down to a
chain of events in the thousandths of a second.
Why is this?
37
Item Scoring
  • Top-down
  • Bottom-up
  • Pilot responses
  • Pre and post responses
  • Inter-rater reliability

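Inter-rater reliability on rubric scores is commonly reported as percent agreement or Cohen's kappa, which corrects for agreement expected by chance. A minimal sketch with hypothetical ratings from two raters (not the Exploring Time data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores."""
    n = len(rater_a)
    # Observed agreement: fraction of responses scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same category
    expected = sum(ca[c] * cb[c] for c in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical 0-2 rubric scores on ten open-ended responses
a = [2, 1, 0, 2, 1, 1, 0, 2, 1, 0]
b = [2, 1, 0, 1, 1, 1, 0, 2, 2, 0]
kappa = cohens_kappa(a, b)
```

Values near 1 indicate strong agreement beyond chance; disagreements flagged by a low kappa feed back into rubric revision during piloting.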
38
Review and Validation
  • Pilot response results
  • Scoring agreement
  • Patterns of responses

39
Results
  • Improved awareness of time
  • Improved timescale identification
  • No change in understanding of adjacent timescales

40
Lessons Learned
  • Begin at the beginning.
  • Need items? Use the script.
  • Budget enough time.
  • Remember: less is more.

41
Future Directions
  • Group assessments
  • Authentic assessments

42
Survey and Panel Studies of QUEST Science Programming
Monnette Fung - monnette@rockman.com
www.rockman.com | San Francisco, CA | Bloomington, IN
43
QUEST
  • Radio
  • Television
  • Community Science Blog
  • Original Web Content
44
Audience Study
Year 1 (2007): Baseline surveys in Spring and Fall
Year 2 (2008): Panel surveys in June, August, and October; New Media Users survey in September
Year 3 (2009): Educator case studies; continue New Media Users survey
45
Recruiting
High Engagement: Members, recent visitors, content consumers
Medium/Family Engagement: Families with children under 16
Lower Engagement: Non-members, infrequent visitors, interest in arts
46
Participants
47
Repeated Question
In the last two months, have you participated in any of the following science/nature-related activities? (Check all that apply)
  • Visited a science museum or nature organization
  • Attended a lecture at a science/nature organization
  • Attended a science café
  • Taken a science or nature-related class or workshop
  • Participated in a nature walk
  • Participated in another science/nature-related activity
48
Unique Questions
Survey 2
  • Participation in arts-related activities (e.g.,
    visits to art museums, performances, or classes)
  • Approximate number of times, and is this typical?
  • Describe a recent science/nature activity
  • What was the activity?
  • With whom did you engage?
  • Why did you engage?

49
Unique Questions
Survey 3
50
Unique Questions
Survey 3
Which option below best describes your engagement with QUEST online video content?
  • I have not watched QUEST video online, and I am not interested in doing so.
  • I have not done so, but I might in the future.
  • I have done so, but I will not do so again.
  • I have done so, and I will continue to do so.
51
Findings
  • Collectively, engagement was steady
  • Families are most engaged
  • Environment-specific activities most popular
  • Potential for engaging high arts group
  • Growth of new media audience

52
New Media Survey
Recruiting: Links and aggregators
Findings: Half were outside the Bay Area; on-air broadcasts are still the primary medium
53
Year 3
Educator Case Studies: Formal and informal educators; new media for science teaching and learning
New Media Users Survey: Continue with the same survey
54
Things to Consider
  • Influence on participant behavior
  • Increased awareness
55
Questions?