Ways of Knowing & Research-Based Practices
Chris Borgmeier, PhD, Portland State University

Transcript and Presenter's Notes


1
Ways of Knowing & Research-Based Practices
  • Chris Borgmeier, PhD
  • Portland State University

2
  • We associate truth with convenience, with what
    most closely accords with self-interest and
    personal well-being, or promises best to avoid
    awkward effort or unwelcome dislocation of life.
    We also find highly acceptable what contributes
    most to self-esteem.
  • John Kenneth Galbraith

3
  • Conventional wisdom must be simple, convenient,
    comfortable, and comforting, though not necessarily
    true.
  • Steven Levitt

4
Ways of Knowing
  • Personal experience
  • Research can stimulate, inform, reinforce,
    challenge & question our own experiences to
    enhance professional judgment
  • Tradition
  • Simply accept what has been done as the best or
    right way (eliminates the need to search for
    knowledge & understanding)
  • Authority
  • People considered to be experts or authorities are
    major sources of knowledge
  • Challenge: these ways of knowing are primarily
    idiosyncratic, informal & influenced heavily by
    subjective interpretation

5
Ways of Knowing
  • Research
  • Involves a systematic process of gathering,
    interpreting and reporting information
  • Disciplined inquiry characterized by accepted
    principles to verify that a claim is reasonable

6
Types of Research
  • Basic Research - formulates & refines theories
  • Applied Research - improves practice & solves
    practical problems
  • Action Research - goal is to solve a specific
    classroom or school problem, improve practice, or
    help make a decision at a single site

7
What to look for in articles
  • Refereed v. Non-refereed articles
  • Refereed - articles reviewed by a panel of
    peers/experts
  • Non-refereed - not reviewed by experts
  • Pay Journals - pay to have information published
  • Primary source - original articles or reports in
    which researchers communicate directly the
    methods & results of their study
  • Need to then evaluate the methods used in the
    study
  • Secondary source - reviews, summarizes, or
    discusses research conducted by others
  • Commentary/opinion

8
Quantitative & Qualitative Research
  • Based on different assumptions about how to best
    understand and come to know what is true
  • Quantitative - emphasizes numbers, measurement,
    deductive logic, control & experiments
  • Qualitative - emphasizes natural settings,
    understanding, verbal narratives, and flexible
    designs

9
Quantitative Research
  • Experimental Research
  • Investigators have control over 1 or more
    variables & manipulate 1 factor to see if it has
    an impact on student behavior
  • Can be used to identify causal relationships
  • True Experimental design - random assignment
  • Quasi-experimental design - no random assignment
  • Single Subject design - experiment with a single
    person or a few individuals

10
Randomized Control Trials
  • Gold Standard for evaluating an intervention's
    effectiveness
  • Studies that randomly assign individuals to an
    intervention group or to a control group, in
    order to measure the effects of the intervention
  • Advantage - allows evaluation of whether the
    intervention caused the outcomes, as opposed to
    other factors (a minimal sketch of this logic
    follows below)
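To make the logic of random assignment concrete, here is a minimal, hypothetical Python sketch; all names and scores are invented and are not drawn from any study in this presentation. Students are shuffled into an intervention group and a control group, and the difference in group means on an outcome measure is then examined.

    import random

    # Hypothetical roster (invented, for illustration only)
    students = [f"student_{i}" for i in range(1, 21)]

    # Random assignment: shuffle, then split into two groups
    random.shuffle(students)
    half = len(students) // 2
    intervention, control = students[:half], students[half:]

    # Suppose an outcome score is later collected for each student
    # (values invented; in a real trial these come from the study's measures)
    scores = {s: random.gauss(75, 10) for s in control}
    scores.update({s: random.gauss(80, 10) for s in intervention})

    # Because assignment was random, pre-existing differences should be
    # balanced across groups, so a difference in group means can be
    # attributed to the intervention rather than to other factors
    mean_i = sum(scores[s] for s in intervention) / len(intervention)
    mean_c = sum(scores[s] for s in control) / len(control)
    print(f"Intervention mean: {mean_i:.1f}")
    print(f"Control mean: {mean_c:.1f}")
    print(f"Difference: {mean_i - mean_c:.1f}")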

11
Quantitative Research
  • Non-experimental Research - no experimental
    manipulation or experimental control of factors
    that may influence subjects
  • Usually because events already occurred, or
    because they can't be manipulated
  • Means research can only describe something or
    identify relationships between variables; cannot
    determine causation
  • Descriptive - info about frequency or amount of
    something
  • Comparative - examine differences between groups
    on a target variable
  • Correlational - investigate relationships between
    2 variables
  • Is there a relationship between two variables?

12
Single Subject Design Example
  • 3 middle school students
  • Measure on-task behavior in 15-sec. intervals
    (momentary time sampling) during first 10 min. of
    class (a scoring sketch follows below)
  • Intervention - greet at door, saying student's
    name & a positive comment
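A minimal sketch of how the momentary time sampling measure above could be scored, assuming the observer records 1 (on task) or 0 (off task) at the end of each 15-second interval during the first 10 minutes of class, giving 40 intervals; the observation values below are invented:

    # Momentary time sampling: one yes/no observation at the end of each
    # 15-second interval; 10 minutes = 600 seconds = 40 intervals.
    # 1 = on task at that moment, 0 = off task (invented example data).
    observations = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0,
                    1, 1, 1, 0, 1, 1, 0, 1, 1, 1,
                    0, 1, 1, 1, 1, 0, 1, 1, 1, 1,
                    1, 1, 0, 1, 1, 1, 1, 1, 0, 1]

    assert len(observations) == (10 * 60) // 15  # 40 intervals

    # Percent of intervals scored on task is the summary measure per session
    percent_on_task = 100 * sum(observations) / len(observations)
    print(f"On task during {percent_on_task:.0f}% of intervals")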

13
Evaluating a Research Study
  • Quantity
  • One study is only one study (unless it's a
    meta-analysis)
  • Convergence of evidence required
  • Quality
  • Type of Research Design
  • Sample (size & match)
  • Measures (do they really measure important change?)

14
Ask a Faculty member
15
Collaborative Problem Solving
  • Visit the website
  • http://www.livesinthebalance.org/
  • What do I notice?
  • A canoe?
  • Advertising products for purchase
  • Lots of testimonials
  • Little bit of research (6 studies listed)
  • Let's take a closer, evaluative look at the
    research

16
Collaborative Problem Solving - data from CPS
website on 6/18/12
  • Johnson, M., Ostlund, S., Fransson, G., Landgren,
    M., Nasic, S., Kadesjo, B., Gillberg, C., &
    Fernell, E. (2012). Attention-Deficit/Hyperactivity
    Disorder (ADHD) with Oppositional Defiant
    Disorder (ODD) in Swedish Children: An Open
    Study of Collaborative Problem Solving. Acta
    Paediatrica, in press.
  • Ollendick, T. H. (2011). Invited Address:
    Effective Psychosocial Treatments for Emotional
    and Behavioral Disorders in Youth. University of
    Stockholm, Sweden.
  • Fraire, M., McWhinney, E., & Ollendick, T.
    (2011). The effect of comorbidity on treatment
    outcome in an ODD sample. In T. Ollendick
    (Chair), Comorbidities in children and
    adolescents: Implications for evidence-based
    treatment. Symposium presented at the
    41st European Association for Behavioral and
    Cognitive Therapies, Reykjavik, Iceland.
  • Halldorsdottir, T., Austin, K., & Ollendick, T.
    (2011). Comorbid ADHD in children with ODD or
    specific phobia: Implications for evidence-based
    treatments. In T. Ollendick (Chair),
    Comorbidities in children and adolescents:
    Implications for evidence-based
    treatment. Symposium presented at the
    41st European Association for Behavioral and
    Cognitive Therapies, Reykjavik, Iceland.
  • Epstein, T., & Saltzman-Benaiah, J. (2010).
    Parenting children with disruptive behaviors:
    Evaluation of a Collaborative Problem Solving
    pilot program. Journal of Clinical Psychology
    Practice, 27-40.
  • Martin, A., Krieg, H., Esposito, F., Stubbe, D.,
    & Cardona, L. (2008). Reduction of restraint and
    seclusion through Collaborative Problem Solving:
    A five-year, prospective inpatient study.
    Psychiatric Services, 59(12), 1406-1412.
  • Greene, R. W., Ablon, S. A., & Martin, A. (2006).
    Innovations: Child Psychiatry: Use of
    Collaborative Problem Solving to reduce seclusion
    and restraint in child and adolescent inpatient
    units. Psychiatric Services, 57(5), 610-616.
  • Greene, R. W., Ablon, J. S., Monuteaux, M., Goring,
    J., Henin, A., Raezer, L., Edwards, G., Markey,
    J., & Rabbitt, S. (2004). Effectiveness of
    Collaborative Problem Solving in affectively
    dysregulated youth with oppositional defiant
    disorder: Initial findings. Journal of Consulting
    and Clinical Psychology, 72, 1157-1164.
  • Greene, R. W., Biederman, J., Zerwas, S.,
    Monuteaux, M., Goring, J., & Faraone, S. V. (2002).
    Psychiatric comorbidity, family dysfunction, and
    social impairment in referred youth with
    oppositional defiant disorder. American Journal
    of Psychiatry, 159, 1214-1224.
  • Greene, R. W., Beszterczey, S. K., Katzenstein,
    T., Park, K., & Goring, J. (2002). Are students
    with ADHD more stressful to teach? Patterns of
    teacher stress in an elementary school sample.
    Journal of Emotional and Behavioral Disorders,
    10, 27-37.

NO studies have been conducted in school
settings; all research is either with parents or
in inpatient clinical settings.
These studies do not evaluate the effectiveness of CPS.
17
Evaluating a Research Study
  • Abstract
  • Introduction & Literature Review
  • Research Questions
  • Method & Design
  • Subjects & Settings / Measures / Procedures
  • Results
  • Discussion & Conclusions
  • References

18
Evaluating the Research Studies
  • 2004 Greene et al. - Subjects: 47 kids with ODD,
    4-12 yrs. old; Setting: outpatient MH clinic at
    hospital; Procedure: compare CPS with a parent
    training (PT) group; Outcome measures: ODDRS
    (unpublished rating scale created by Greene) &
    improvement ratings (maternal & therapist);
    Outcome: improved slightly more than PT
  • 2006 Greene, Ablon, & Martin - Subjects: 3-14
    yrs. old; Setting: inpatient psychiatric unit
    (13 beds); Procedure: trained unit staff
    (pre/post); Outcome measures: restraints &
    seclusions; Outcome: reduced
  • 2008 Martin et al. - Subjects: school-age;
    Setting: inpatient psychiatric unit (15 beds);
    Procedure: trained unit staff (pre/post);
    Outcome measures: restraints & seclusions;
    Outcome: reduced
  • 2010 Epstein & Saltzman-Benaiah - Subjects: kids
    with ODD, under 12 yrs.; Setting: outpatient
    clinic; Procedure: group CPS parent training
    (pre/post); Outcome measures: Eyberg Child
    Behavior Inventory & Parent Stress Index;
    Outcome: improvement pre to post
19
What does the research tell us?
  • So what do we know?
  • Based on 4 evaluation studies
  • All include children ages 12 or less (the 2008
    study does not specify an age range, simply
    "school age")
  • 2 are in inpatient psychiatric hospitals
  • 1 is an outpatient mental health clinic
  • 1 is a parent training program
  • In school settings: 0
  • The research tells us nothing about the efficacy
    of CPS in school settings

20
What does the research tell us?
  • Outcome measures
  • ODD Rating Scale (unpublished assessment created
    by the author) & improvement ratings from parent
    & therapist
  • Similar scores to parent training
  • Reductions in restraint & seclusion (Pre/Post)
  • Is this due to student behavior change or adult
    behavior change?
  • Eyberg CBI & Parent Stress Index (Pre/Post)
  • No studies directly measure changes in student
    behavior

21
Concerns
  • Only 4 research studies evaluating CPS in 4 years
  • 2 on parent training (1 individual training & 1
    group training)
  • 2 in inpatient psychiatric facilities
  • Make sure research you are looking at takes place
    in settings that match your application
  • E.g. school settings v. treatment centers
  • 2 of 4 studies have been conducted by the author
    of the program
  • Concern if authors are benefiting financially
    from sale of the program

22
School-wide PBIS - Let's compare!
  • www.pbis.org
  • Click on Resource Catalog
  • Then Literature List
  • Evidence Base for SW-PBIS
  • Randomized Control Trials

23
Randomized Control Trials of SW-PBIS
  • Tier 1 / Universal SW-PBIS
  • Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J.
    (2010). Examining the effects of school-wide
    positive behavioral interventions and supports on
    student outcomes: Results from a randomized
    controlled effectiveness trial in elementary
    schools. Journal of Positive Behavior
    Interventions, 12(3), 133-148.
  • Bradshaw, C., Koth, C., Bevans, K., Ialongo, N.,
    & Leaf, P. (2008). The impact of school-wide
    positive behavioral interventions and supports
    (PBIS) on the organizational health of elementary
    schools. School Psychology Quarterly.
  • Bradshaw, C., Reinke, W., Brown, L., Bevans, K.,
    & Leaf, P. (2008). Implementation of school-wide
    positive behavioral interventions and supports
    (PBIS) in elementary schools: Observations from a
    randomized trial. Education and Treatment of
    Children, 31, 1-26.
  • Horner, R. H., Sugai, G., Smolkowski, K., Eber,
    L., Nakasato, J., Todd, A. W., & Esperanza, J.
    (2009). A randomized, wait-list controlled
    effectiveness trial assessing school-wide
    positive behavior support in elementary schools.
    Journal of Positive Behavior Interventions,
    11(3), 133-144.
  • Sprague, J., Biglan, A., et al. (in progress). A
    Randomized Control Trial of SWPBS with Middle
    Schools.

24
Meta-Analysis
  • A statistical reviewing technique that provides a
    quantitative summary of findings across an entire
    body of research
  • The results of individual studies are converted
    to a standardized metric or effect size
  • These scores are then aggregated across the
    sample of studies to yield an overall estimate of
    effect size (a computational sketch follows below)
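A minimal sketch of that aggregation step, assuming a simple fixed-effect approach in which each study's effect size is weighted by the inverse of its variance; the study labels and values are invented purely for illustration:

    # Each tuple: (study label, effect size d, variance of that effect size)
    # All values are invented, purely to show the aggregation arithmetic.
    studies = [
        ("Study A", 0.45, 0.04),
        ("Study B", 0.60, 0.02),
        ("Study C", 0.30, 0.05),
    ]

    # Fixed-effect weighting: weight each effect size by 1 / variance,
    # then take the weighted average across the sample of studies.
    weights = [1 / var for _, _, var in studies]
    weighted_sum = sum(w * d for (_, d, _), w in zip(studies, weights))
    overall_effect = weighted_sum / sum(weights)

    print(f"Overall effect size estimate: {overall_effect:.2f}")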

25
Effect Size
  • Particular attention is given to the magnitude of
    the effect size (a worked example follows below)
  • .80 = large effect size
  • .50 = moderate effect size
  • .20 = small effect size
  • (Cohen, 1988)
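For concreteness, a worked illustration of Cohen's d, one common effect size: the difference between two group means divided by the pooled standard deviation, which can then be read against the benchmarks above. The group scores are invented.

    import statistics

    # Invented outcome scores for an intervention group and a control group
    intervention = [82, 78, 90, 85, 88, 79, 84]
    control = [75, 72, 80, 78, 74, 77, 73]

    # Pooled standard deviation of the two groups
    n1, n2 = len(intervention), len(control)
    s1, s2 = statistics.stdev(intervention), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5

    # Cohen's d: difference in means divided by the pooled SD
    d = (statistics.mean(intervention) - statistics.mean(control)) / pooled_sd
    print(f"Cohen's d = {d:.2f}")  # ~.20 small, ~.50 moderate, ~.80 large (Cohen, 1988)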

26
Web Resources
  • What Works Clearinghouse
  • IES Practice Guides
  • IRIS Modules
  • Meta-Analyses & Research Reviews
  • Many more