1
Bias in Epidemiological Studies
Dr. Sohinee Bhattacharya (sohinee.bhattacharya@abdn.ac.uk)
2
Overview of Lecture
  • Random vs Systematic Error
  • What is Bias?
  • Types of Bias
  • Bias in Relation to Study Design
  • Effect of Bias on Results
  • Eliminating bias

3
News of the Week.
4
Error
  • An error is, by definition, an act, an assertion, or a belief that
    deviates from what is right... but what is right?
  • The true length of a metre is arbitrarily decided by agreeing on a
    definition
  • The difference between a "correct" metre stick and an erroneous one
    can be accurately measured
  • For health and disease the truth is usually unknown and cannot be
    defined in the way we define the metre
  • Error should be considered an inevitable and important part of human
    endeavour
  • The Popperian view is that science progresses by the rejection of
    hypotheses (falsification) rather than the establishing of so-called
    truths (verification)

5
Figure 4.1
(a) Error is unequal in one of these groups, leading to a false
interpretation of the pattern of disease: falsely detecting differences
(b) Error is unequal in one of these groups, leading to a false
interpretation of the pattern of disease: failure to detect differences
6
Error
  • Due to chance
  • Random error: can produce a Type I error, rejecting a null hypothesis
    in the sample that is true in the population (controlled with
    p-values and confidence intervals)
  • Due to bias
  • Systematic error (controlled in the design phase; effect size and
    direction estimated in the analysis phase), as illustrated in the
    sketch below
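
A minimal Python sketch of this distinction (the measurement values are
invented for illustration, not taken from the lecture): random error
scatters measurements around the truth, so the average converges on the
true value as the sample grows, whereas a systematic error shifts every
measurement and no sample size removes it.

  import random, statistics

  random.seed(1)
  true_value = 120.0  # assumed "true" mean of the quantity being measured

  # Random error: measurements scatter symmetrically around the truth
  random_only = [true_value + random.gauss(0, 10) for _ in range(10_000)]

  # Systematic error (bias): every measurement is shifted by the same amount
  systematic = [true_value + 8 + random.gauss(0, 10) for _ in range(10_000)]

  print(round(statistics.mean(random_only), 1))  # close to 120.0
  print(round(statistics.mean(systematic), 1))   # close to 128.0, however large the sample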

7
RANDOM ERROR
8
Random Error
  • Random deviation from the truth
  • Incorrect assessment of exposure / outcome
  • Continuous data: incorrect measurement
  • Binary / categorical data: incorrect categorisation
  • May result from
  • Poor instruments / tests
  • Data-entry error
  • Subject error

9
Random Error
  • Also known as
  • Random misclassification
  • Random noise
  • Decreases the likelihood of observing an effect
  • Biases findings towards the null
  • Increases the likelihood of a Type 2 error (falsely accepting H0)
  • Serves to underestimate any association, as in the simulation sketch
    below
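
A minimal simulation sketch of this attenuation (the exposure prevalence,
disease risks and sample size are invented): the true exposure odds ratio
is about 2.25, but when the recorded exposure status is wrong with the
same probability in everyone (non-differential misclassification), the
observed odds ratio moves towards 1.

  import random

  random.seed(2)

  def observed_odds_ratio(misclass_prob, n=200_000):
      # Disease risk is 20% in the exposed vs 10% in the unexposed: true OR ~ 2.25.
      # Exposure is then misrecorded with the same probability in everyone.
      a = b = c = d = 0
      for _ in range(n):
          exposed = random.random() < 0.3
          diseased = random.random() < (0.20 if exposed else 0.10)
          recorded_exposed = exposed if random.random() > misclass_prob else not exposed
          if diseased and recorded_exposed:
              a += 1
          elif diseased:
              b += 1
          elif recorded_exposed:
              c += 1
          else:
              d += 1
      return (a * d) / (b * c)

  print(round(observed_odds_ratio(0.0), 2))  # ~2.25 with perfect measurement
  print(round(observed_odds_ratio(0.2), 2))  # pulled towards the null (1.0)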

10
NON-RANDOM ERROR: BIAS
11
Bias: Webster's Definitions
  • A line diagonal to the grain of a fabric
  • Highly personal and unreasoned distortion of judgment
  • Systematic error introduced by encouraging one outcome over others

12
Bias
  • A preference or an inclination
  • Bias may be intentional or unintentional
  • In statistics a bias is an error caused by
    systematically favoring some outcomes over others
  • Bias in epidemiology can be conceptualised as
    error which applies unequally to comparison
    groups.

13
Bias
  • Sources of variation that distort the study
    findings in one direction
  • Is a spurious association or effect on an outcome
    in a particular study
  • Is a systematic error
  • biased measurement vs biased results

14
A Biased Research Question
  • Are girls cleverer than boys?
  • The unbiased starting point would be the null hypothesis that there
    are no gender differences in intelligence
  • The underlying values of the researchers may be that girls are more
    intelligent than boys
  • Such bias is likely to be revealed at the analysis and interpretation
    stage, through biased interpretation
  • It is problematic to describe difference without conveying a sense of
    superiority and inferiority

15
The research question
  • The Syphilis Study of the US Public Health Service followed up 600
    African American men for some 40 years
  • The question: does syphilis have different and, particularly, less
    serious outcomes in African Americans than in Americans of European
    origin?
  • Investigators denied the study subjects treatment even when it was
    available and curative (penicillin)

16
Bias
  • Systematic deviation from the truth
  • Can increase or decrease an effect estimate
  • Study design can (help to) eliminate
  • Can be a problem, but may be useful
  • Can be investigated
  • Size of increase or decrease can be estimated

17
Types of Bias
  • Selection bias
  • Concerned about who is in your study
  • Information bias
  • Concerned about the information you elicit from
    your subjects

18
Selection bias
  • Selection bias is inevitable, simply because
    investigators need to make choices
  • Captive populations are popular; some may be representative
    (e.g. schoolchildren), others not at all (e.g. university students)
  • People are also missed either inadvertently or
    because they actively do not participate
  • Selection bias matters much more in epidemiology
    than in biologically based medical sciences.
  • Biological factors are usually generalisable
    between individuals and populations, so there is
    a prior presumption of generalisability
  • If an anatomist describes the presence of a
    particular muscle, or cell type, based on one
    human being it is likely to be present in all
    human beings (and possibly all mammals)

19
Figure 4.2
  • Ignoring populations
  • Questions harming one population
  • Measuring unequally
  • Generalising from unrepresentative populations
(The diagram contrasts the study population, an ignored population and a
comparison population)
20
Selection Bias
  • Often occurs due to failure of intended sample to
    be representative of target population
  • Hospitalized patients not representative of full
    range of patients with disease
  • Randomized patients selected by response to
    therapy
  • Protocol failures

21
Sources of Selection Bias
  • Volunteer Bias
  • Referral Bias
  • Healthy worker effect
  • Non-participation bias

22
Volunteer Bias: High Grades
23
Referral Bias
  • Example: a study to examine mortality from cardiovascular disease
  • Based on death reports from cardiovascular disease in 2 tertiary
    referral hospitals

24
Healthy Worker Effect
  • Typically, but not exclusively, in occupational
    environments
  • Example
  • Case-control study

25
Non-participation Bias
  • Most frequent cause for concern in large-scale
    epidemiological surveys
  • Non-participation
  • Loss to follow-up
  • Systematic differences between participants and
    non-participants
  • With respect to the relationship under
    examination
  • Typical non-participants: young people / men / ethnic minorities

26
Types of Bias
  • Selection bias
  • Concerned about who is in your study
  • Information bias
  • Concerned about the information you elicit from
    your subjects

27
Information Bias
  • What information are you getting from subjects?
  • Concern
  • Are there systematic differences in what is being
    collected, between study groups?
  • Does each subject have an equal chance of
    providing the same information?
  • Sources
  • Observer bias
  • Recall bias

28
Observer Bias
  • Interviewer knowledge may influence structure of
    questions
  • Preconceived expectations of study outcome
  • Study methods may change over time
  • Different investigators may examine different
    subjects
  • Times / locations of interviews may vary

29
Attention Bias
  • Hawthorne Effect
  • An increase in worker productivity produced by
    the psychological stimulus of being singled out
    and made to feel important
  • People may respond differently if they think they
    know what is being studied
  • Potential effect
  • Increased or decreased prevalence of disease
  • Distorted relationship under examination

30
Surveillance Bias
  • Can arise if one group is over-researched, in
    comparison with the other
  • Case-control study
  • Tendency to examine more closely those with the outcome of interest
  • Example: the association between alcohol consumption and
    oropharyngeal cancer
  • Cohort study / Randomised trial
  • Tendency to follow more closely (or for longer) those with the
    exposure of interest
  • Example: the association between CBT and low back pain

31
Recall Bias
  • A major concern where exposure data are measured retrospectively
  • Case-control studies (including case-control analysis of a
    cross-sectional survey)
  • Concern
  • Differential recall between cases and controls, as in the simulation
    sketch below
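
A minimal sketch of how differential recall can manufacture an
association out of nothing (the exposure prevalence and recall
probabilities are invented): exposure is equally common in cases and
controls, so the true odds ratio is 1, but if cases remember the exposure
more completely than controls the observed odds ratio rises above 1.

  import random

  random.seed(3)

  def reported_odds_ratio(recall_cases, recall_controls, n=200_000):
      # 40% of both cases and controls were truly exposed (true OR = 1).
      # Exposed subjects only *report* the exposure with the given recall probability.
      a = b = c = d = 0
      for _ in range(n):
          is_case = random.random() < 0.5
          exposed = random.random() < 0.4
          recall = recall_cases if is_case else recall_controls
          reports_exposure = exposed and random.random() < recall
          if is_case and reports_exposure:
              a += 1
          elif is_case:
              b += 1
          elif reports_exposure:
              c += 1
          else:
              d += 1
      return (a * d) / (b * c)

  print(round(reported_odds_ratio(0.9, 0.9), 2))  # ~1.0: equal recall, no spurious effect
  print(round(reported_odds_ratio(0.9, 0.6), 2))  # >1.0: cases recall better, bias away from the null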

32
Recall Bias
33
(No Transcript)
34
Epidemiological Study Design
Assess exposure and outcome
35
Bias in Case-Control Studies
  • Study design most susceptible to bias
  • Separate sampling of cases and controls
  • Retrospective measurement of predictors
  • Selection
  • Cases: take all
  • Controls: similar in all respects to cases except in terms of disease
    occurrence
  • Hospital vs community controls

36
Bias in Case-Control Studies
  • Ascertainment
  • Differential measurement of predictors
  • Recall bias (use data recorded before the outcome)
  • Blinding

37
Bias in Cohort Studies
  • Loss to follow-up (attrition bias)
  • Example: following up newborn infants (Victora, 1987)
  • The loss to follow-up differed systematically by socioeconomic status

38
Bias in RCTs
  • Selective failure to
    - receive the intervention
    - adhere to the protocol
    - get follow-up
  • Control with
    - blinding, which is as important as randomisation (of investigators,
      subjects and all others)
    - randomisation (which also controls for unknown confounders)
    - analysis by intention to treat, as in the sketch below
    - design to prevent drop-outs (run-in period)
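
A minimal sketch of why intention-to-treat analysis matters (all numbers
are invented): the treatment has no true effect, but sicker participants
in the treated arm are more likely to abandon it. Analysing subjects as
randomised gives a risk difference near zero, while a per-protocol
comparison of adherers against controls makes the useless treatment look
protective.

  import random

  random.seed(4)

  itt_treat, itt_ctrl, pp_treat, pp_ctrl = [], [], [], []

  for _ in range(200_000):
      severe = random.random() < 0.3
      treated = random.random() < 0.5                            # randomisation
      bad_outcome = random.random() < (0.4 if severe else 0.1)   # treatment has no effect
      # Severe participants are more likely to abandon the intervention
      adherent = (not treated) or (random.random() < (0.5 if severe else 0.9))
      (itt_treat if treated else itt_ctrl).append(bad_outcome)
      if adherent:
          (pp_treat if treated else pp_ctrl).append(bad_outcome)

  def risk(outcomes):
      return sum(outcomes) / len(outcomes)

  print(round(risk(itt_treat) - risk(itt_ctrl), 3))  # ~0.000: intention to treat, unbiased
  print(round(risk(pp_treat) - risk(pp_ctrl), 3))    # negative: per-protocol looks falsely protective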

39
Other Types of Bias
  • Investigator bias
  • Conflicts of interest
  • Publication bias
  • Published studies not representative of all studies conducted
  • Spectrum bias
  • Relevant to research on medical tests
  • Inappropriate range of test results or disease
  • Range of disease not representative of the target population
  • e.g. comparing advanced disease with normals

40
Spurious Associations from Bias
  • Difference between the research question and the question actually
    asked
  • For populations and samples
  • Target population: all adults
  • Intended sample: patients in a clinic
  • For the phenomenon of interest
  • Cause: coffee consumption
  • Variable: self-reported coffee consumption

41
Spurious Associations from Bias
  • In the outcome
  • Effect: myocardial infarction
  • Outcome: diagnosis of infarction from claims data

42
Preventing Bias
  • Bias is a design-phase problem
  • Don't introduce bias in the analysis phase by attempting to correct
    for confounding or other problems
  • In the design phase, assess the direction (and occasionally the
    magnitude) of the effect of any bias by using outside information
  • e.g. if a positive effect measure is found and the bias is toward the
    null, the problem is minimal: the true effect is at least as large as
    the observed one (see the correction sketch below)
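
A sketch of using outside information in this way (the 2x2 table,
sensitivity and specificity below are all invented): the observed
exposure counts are back-corrected for assumed non-differential
misclassification, and the adjusted odds ratio lies a little further from
the null than the observed one, so the direction of the finding holds.

  def true_exposed(observed_exposed, group_total, sensitivity, specificity):
      # Back-calculate the number truly exposed from the observed count,
      # given the assumed sensitivity and specificity of exposure classification.
      return (observed_exposed - (1 - specificity) * group_total) / (sensitivity + specificity - 1)

  def odds_ratio(a, b, c, d):
      # a, b = exposed, unexposed among cases; c, d = exposed, unexposed among controls
      return (a * d) / (b * c)

  # Hypothetical observed table: 80/120 exposed/unexposed cases, 60/140 controls
  observed_or = odds_ratio(80, 120, 60, 140)   # ~1.56

  # Outside information: exposure classified with sensitivity 0.85, specificity 0.95
  a = true_exposed(80, 200, 0.85, 0.95); b = 200 - a
  c = true_exposed(60, 200, 0.85, 0.95); d = 200 - c
  adjusted_or = odds_ratio(a, b, c, d)         # ~1.71, further from the null

  print(round(observed_or, 2), round(adjusted_or, 2))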

43
Selection Bias: What To Do
  • Prevent
  • Minimise
  • Estimate effect
  • What effect(s) might it have had on your study?
  • How might this change the results?
  • Does this change the conclusions?

44
Minimising Selection Bias
  • Be aware
  • Potential sources of selection bias
  • Equal opportunity for participation and follow-up
  • Cases / Controls
  • Exposed / Unexposed groups
  • Intervention / Control groups
  • Tactics for high participation / follow-up rates
  • Reminders / Postcards / Phone calls

45
Assessing Selection Bias
  • Demographic approach / Alternative data
  • What information is available on your
    non-responders?
  • Where did you get your sample from?
  • Can you examine response by age / sex /
    occupation / etc?
  • Examine reluctant responders

46
Estimating Effect of Selection Bias
  • A study to estimate the prevalence of asthma in schoolchildren
  • Subjects: children aged 5 to 8 years
  • Response rate: 60%
  • What is the potential effect of non-response bias?
  • It depends on the characteristics of the non-responders (see the
    worked sketch below)
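
A worked sketch of the extreme-case bounds (the 15% prevalence among
responders is invented; only the 60% response rate comes from the slide):
the true prevalence depends entirely on the unknown prevalence among the
40% of children who did not respond.

  # Hypothetical: 60% of the children responded and 15% of responders had asthma
  response_rate = 0.60
  prevalence_in_responders = 0.15

  # Extreme-case bounds on the true prevalence among all children
  lower = response_rate * prevalence_in_responders                        # non-responders: 0% asthma
  upper = response_rate * prevalence_in_responders + (1 - response_rate)  # non-responders: 100% asthma

  print(f"true prevalence between {lower:.0%} and {upper:.0%}")  # between 9% and 49%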

47
Minimising Observer Bias
  • Standardised techniques / instruments / etc
  • Thorough training of data collection staff
  • Test agreement between interviewers / instruments
  • Use objective measurements where possible
  • Where possible, researchers should be
  • Randomly allocated to subjects
  • Blind to study question
  • Blind to case / control status

48
Minimising Attention Bias
  • Mask the true study question from participants
  • Ethics: informed consent (e.g. to a general "health study")
  • Collect information about several outcomes
  • Difficult in a case-control study
  • Collect information about several exposures
  • Ensure anonymity

49
Minimising Surveillance Bias
  • Ensure identical methodological procedures for
    all study participants
  • Where possible, blind researchers
  • To study question
  • To case / control status
  • To exposure / non-exposure status
  • To treatment / non-treatment group

50
Minimising Recall Bias
  • Minimise period of recall (if possible)
  • Measure exposure data objectively
  • Medical notes
  • Third-party verification of exposure information
  • Triangulation of measurements
  • Can you conduct a prospective study?

51
SUMMARY
52
Summary: Bias
  • Concerned with the internal validity of a study
  • i.e. the extent to which, within the subjects
    studied, the results are true
  • Deviation of results, or inferences, from the
    truth
  • Or, processes leading to such deviation
  • Results from some aspect(s) of study design or
    conduct

53
Bias Summary
  • Increases or decreases the likelihood of observing an effect
  • Biases findings towards or away from the null
  • Increases the likelihood of a Type 1 error (falsely rejecting H0)
  • Increases the likelihood of a Type 2 error (falsely accepting H0)
  • Serves to over- or under-estimate any association

54
Bias Summary
  • Can be prevented by design
  • Can be estimated
  • Cannot be overcome by analysis
  • May be useful