Transcript and Presenter's Notes

Title: Validity


1
Validity
  • Accuracy: the extent to which the instrument
    measures what it purports to measure
  • The extent to which the proposed interpretation
    of a measurement is supported by empirical and
    other evidence (including consequences)

2
Types of Validity
  • Content Validity
  • Construct Validity
  • Concurrent Validity
  • Predictive Validity

3
Content Validity
  • When a test is used to assess prior learning
  • Does the measuring instrument (test,
    questionnaire, inventory) cover the domain of
    material (content) that it is supposed to cover?
  • Does the test cover a representative sample of
    the behavior domain (skills, process) to be
    measured?
  • Can the test performance be considered free of
    irrelevant factors?

4
Assessing Content Validity
  • Define the domain of content and processes
  • Conduct a logical analysis of the domain
  • Select a pool of items with proper statistical
    characteristics to represent the concepts in the
    domain

5
Construct Validity
  • Used when the purpose is to validate a theory or
    assess a human (psychological) trait (constructs
    inferred from unobservable behavior)
  • The concern is with the constructs embedded in
    the measure (does the instrument measure the
    psychological trait(s) it is supposed to measure?)
  • Focus is on scientific inquiry (theory building):
    understanding the psychological constructs that
    account for human behavior and performance

6
Assessing Construct Validity of a Measuring
Instrument
  • Operationally define the construct
  • Specify its relationships with other constructs
  • Process: correlate the measure with other measures
    (coefficients range from -1.00 to 1.00)
  • Convergent and discriminant validation (see the
    sketch after this list)
  • Tests/items that measure different traits should
    not correlate with each other
  • Tests/items that measure the same trait should
    correlate with each other
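
A minimal sketch of the convergent/discriminant check, using an invented
respondent-by-item score matrix (the trait labels A and B, the item names,
and all values are hypothetical): items written for the same trait should
correlate highly, while items written for different traits should not.

    import numpy as np

    # Hypothetical item scores for 8 respondents. Items A1 and A2 are written
    # for trait A, items B1 and B2 for trait B; names and values are invented.
    data = np.array([
        # A1  A2  B1  B2
        [ 5,  4,  2,  1],
        [ 4,  5,  3,  2],
        [ 2,  2,  4,  5],
        [ 1,  2,  5,  4],
        [ 3,  3,  3,  3],
        [ 5,  5,  1,  2],
        [ 2,  1,  4,  4],
        [ 4,  4,  2,  3],
    ])

    r = np.corrcoef(data, rowvar=False)  # 4 x 4 item correlation matrix
    labels = ["A1", "A2", "B1", "B2"]
    print("      " + "  ".join(f"{lab:>5}" for lab in labels))
    for lab, row in zip(labels, r):
        print(f"{lab:>5} " + "  ".join(f"{v:5.2f}" for v in row))

    # Convergent evidence:   r(A1, A2) and r(B1, B2) should be high and positive.
    # Discriminant evidence: the A-B correlations should be much lower.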

7
Concurrent Validity
  • Used when validating a newly developed measure
  • Administer the new measure and an accepted measure
    of the same construct at the same time
  • Examine the correlation coefficient (validity
    coefficient) between the measures (see the sketch
    after this list)
  • Concurrent validity coefficient: the correlation
    between a given measure and some criterion
    performance (-1.00 to 1.00)
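
A minimal sketch of computing a concurrent validity coefficient, assuming
SciPy is available; the two score lists are hypothetical scores from the
same people on a new measure and an established measure of the same
construct administered in one session.

    from scipy.stats import pearsonr

    # Hypothetical scores from the same 10 people on a newly developed scale
    # and on an established measure of the same construct, given together.
    new_measure = [12, 18, 25, 9, 30, 22, 15, 27, 11, 20]
    established = [14, 20, 28, 10, 33, 21, 13, 29, 12, 22]

    r, p = pearsonr(new_measure, established)
    print(f"Concurrent validity coefficient r = {r:.2f} (p = {p:.3f})")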

8
Predictive Validity
  • To predict performance or behavior
  • To select or place people into a program,
    position, or task
  • To develop a shorter test (test substitution)
  • Predictive validity coefficient: the correlation
    between a given measure and some later criterion
    performance (-1.00 to 1.00); a sketch follows
    this list
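
A minimal sketch of estimating a predictive validity coefficient; the
admissions-test scores and later GPA values are hypothetical, and the fitted
line only illustrates how the predictor would be used.

    import numpy as np

    # Hypothetical data: a predictor collected now (an admissions test) and a
    # criterion collected later (first-year GPA) for the same 10 students.
    test_scores = np.array([520, 610, 480, 700, 560, 650, 590, 450, 680, 530])
    later_gpa = np.array([2.8, 3.2, 2.5, 3.8, 3.0, 3.4, 3.1, 2.4, 3.6, 2.9])

    # Predictive validity coefficient: correlation of predictor with criterion.
    r = np.corrcoef(test_scores, later_gpa)[0, 1]
    print(f"Predictive validity coefficient r = {r:.2f}")

    # A least-squares line illustrates how the test would be used for prediction.
    slope, intercept = np.polyfit(test_scores, later_gpa, 1)
    print(f"Predicted GPA for a test score of 600: {slope * 600 + intercept:.2f}")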

9
All Validity Questions Resolve Into Questions of
Construct Validity
[Diagram: conceptual framework, content/face validity, representativeness,
and criterion evidence (predictive, concurrent) together with consequential
evidence all feed into CONSTRUCT VALIDITY]
10
Warnings about Validity
  • Interpretations, not measures (e.g., tests), are
    validated
  • The same measure may have many validities
    depending on intended uses
  • Validation is a process, not an end result
  • Validity coefficients are only estimates (a
    matter of degree)

11
Relation of Validity to Reliability
  • Reliability is a necessary but not sufficient
    condition of validity (see the note after this
    list)
  • If a test does not measure whatever it measures
    consistently, then it cannot be valid for any
    purpose
  • A test may measure with a high degree of
    reliability yet have no validity (usefulness for
    the purpose for which it is intended)
  • A valid test is reliable
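
In classical test theory this link can be stated as a bound (the
correction-for-attenuation result): the observed validity coefficient can be
no larger than the square root of the product of the reliabilities of the
measure and the criterion,

    r_XY <= sqrt(r_XX * r_YY)

so, for example, a test with reliability 0.49 could show a validity
coefficient of at most 0.70 even against a perfectly reliable criterion.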

12
General Warnings about Reliability and Validity
  • Data quality by itself does not determine the
    degree to which results can be trusted
  • Beware of unjustified claims
  • Validity and reliability are characteristics of
    the data produced by a measuring instrument, not
    of the instrument itself
  • Validity and reliability coefficients are only
    estimates
  • Both are subject to the same warnings that apply
    to any correlation

13
Evaluating Instruments
  • Have the actual questions and/or directions been
    provided?
  • Are special formats, settings, and/or restrictions
    described?
  • When appropriate, are multiple methods used?
  • For published instruments, have sources of
    additional information been cited?

14
Evaluating Instruments (Cont)
  • If sensitive matters are measured, is there reason
    to believe accurate data were obtained?
  • Does the instrumentation obtrude on or change any
    behaviors that were observed?
  • If the collection and coding of data are highly
    subjective, is there evidence that a similar
    result would be obtained if another researcher
    measured the same group, at the same time, with
    the same measure?
  • If the instrument is designed to measure a unitary
    trait, does it have adequate internal consistency?
    (A sketch for estimating it follows this list.)
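
A minimal sketch of one common internal-consistency estimate, Cronbach's
alpha, computed from a hypothetical respondent-by-item score matrix (the
function name and the data are illustrative only).

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                         # number of items
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: 6 people answering a 4-item scale (ratings 1-5).
    scores = [
        [4, 5, 4, 4],
        [2, 2, 3, 2],
        [5, 4, 5, 5],
        [3, 3, 3, 4],
        [1, 2, 1, 2],
        [4, 4, 5, 4],
    ]
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")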

15
Evaluating Instruments (Cont)
  • For stable traits, is there evidence of temporal
    stability?
  • When appropriate, is there evidence of content
    validity?
  • When appropriate, is there evidence of empirical
    validity?
  • Is the instrument adequate in light of the
    research purpose?