Research Seminars in IT in Education MIT6003 Educational Research Design - PowerPoint PPT Presentation

1
Research Seminars in IT in Education (MIT6003): Educational Research Design
  • Dr Jacky Pow

2
Agenda
  • Identification / formulation of research problems
  • Research literature review
  • Systematic inquiry and research design
  • Approaches to research (Choice of research
    approach)

3
Formulation of research questions
  • What is the itch of doing research? 
  • Curiosity
  • Experience
  • Need for deciding or acting (solving a problem)
  • Building on or contesting existing theory
  • Available funding
  • Job?

"If you would not be forgotten, as soon as you
are dead and rotten, either write things worth
reading, or do things worth the writing"
--Benjamin Franklin
4
Finding a research topic
  • Finding a suitable topic for research can be
    difficult. Starting points can be
  • Discussion with a supervisor or colleagues
  • Discussion using Internet facilities such as
    e-mail, discussion groups or mailing lists
  • Databases of current and completed research
    projects, e.g., http://eric.ed.gov/
  • Further research needed sections of journal
    articles, papers, theses and dissertations in the
    subject area of your interest
  • Topic definitions in encyclopedias and literature
    reviews in the journal articles

5
Finding a research topic
  • Good research depends on
  • A clear question
  • Understanding the context of the question
  • Being able to use appropriate methods to answer
    the question
  • Knowing when to stop
  • Before thinking about how to answer a research
    question you need to be clear about the topic,
    the concepts it includes and how they are to be
    defined

6
Finding a research topic
  • In each case, you are looking for
  • Topics where there is doubt and uncertainty
  • Disputed or contradicted statements
  • Topics where evidence is incomplete, lacking,
    dated
  • Topics where evidence from a study on one
    community or group could be compared with
    evidence from an associated group
  • Topics which interest you

7
Basic ways of problem formulation
  • Practical problems in the field
  • i.e., you try to solve a practical problem in
    your particular setting
  • Literature in your specific field
  • e.g. you have found a topic that interests you
    and you want to investigate it
  • Think up a research topic
  • e.g. you want to understand more about a topic
    that you think is important

8
What is a research question?
  • In its simplest form a research question is a
    question that a scientist wants to find the
    answer to
  • Do we need a formal definition?

9
The research question
  • You may start off with a deliberately vague
    research question and then refine it as you go
    along, adding more detail and more specificity
    with each iteration
  • In each iteration, literature review and
    reflection play an important role

10
Is the study feasible?
  • Is the question answerable?
  • Resource constraints
  • trade-offs between rigor and practicality
  • how long will the research take?
  • costs
  • Ethical constraints

"Well begun is half done" --Aristotle, quoting an
old proverb
11
Knowing the field
  • Knowing the field is a vital part of research,
    indicating that you are aware of the main
    theories, structures, debates and propositions in
    a topic area, who is actively thinking about it,
    and which organizations have an interest in it or
    responsibility for it. Without this knowledge it
    is difficult to put together a credible research
    proposal and almost impossible to carry out
    research with any hope of success

12
Research literature review
  • The purpose of the literature review is to
  • demonstrate appropriate knowledge of the field
  • show a critical awareness of previous work and
    how it relates to your work
  • help the reader to understand what the problem
    was that you explored
  • give perspective to the work, to help orient the
    reader towards the rest of the study

13
What to leave in and what to take out
  • The literature review needs to be concise and
    relevant
  • Include only those reviews that help support the
    central argument
  • Argument is the link that connects the data and
    the conclusions. Hence providing a description of
    each work is not adequate; reasoned organization
    and evaluation are a must in a literature review

14
Sources for literature review
  • Library
  • Books
  • Journals
  • Databases (e.g., ERIC)
  • Handbooks and Encyclopedias
  • Archives
  • Non-print materials (audios and videos)
  • Newspapers
  • Computer programs

15
Sources for literature review
  • Internet
  • WWW
  • AERA, http://www.aera.net/
  • Discussion group/forum
  • Waldorf, http://www.waldorfworld.net/waldorflist/
  • Internet journals
  • From Now on, http://www.fno.org/index.html
  • AERA SIG, http://aera-cr.asu.edu/links.html

16
Inference
  • The attempt to generalize on the basis of limited
    information
  • Impractical, in terms of time and cost, to obtain
    total knowledge about everything
  • The process to generalize should be made explicit

17
Systematic inquiry and research design
  • Your research must show a clear link between your
    research question and
  • Research method to be used (e.g., survey, case,
    action research)
  • Actual data to be collected (how, when, where,
    what kind)
  • Analysis of data (how, what)

18
The research [diagram]
19
Systematic inquiry and objectivity
  • People are not objective, but research can be
    made more objective by following the rules of
    objectivity
  • an open atmosphere of critical inquiry: the good
    researcher is a self-critic
  • only testable statements are relevant: findings
    must be replicable
  • faith in the scientific method, tempered by
    skepticism
  • belief that most natural phenomena can be
    understood (even if only in a limited and
    probabilistic manner)
  • complete honesty in the research process: what
    evidence is there against your hypothesis?

20
Steps in empirical research - 1
  • Problem statement, purposes, benefits
  • What exactly do I want to find out?
  • What is a researchable problem?
  • What are the obstacles in terms of knowledge,
    data availability, time, or resources?
  • Do the benefits outweigh the costs?

21
Steps in empirical research - 2
  • Theory, assumptions, background literature
  • What does the relevant literature in the field
    indicate about this problem?
  • To which theory or conceptual framework can I
    link it?
  • What are the criticisms of this approach, or how
    does it constrain the research process?
  • What do I know for certain about this area?
  • What is the history of this problem that others
    need to know?

22
Steps in empirical research - 3
  • Variables and hypotheses
  • What will I take as given in the environment?
  • Which are the independent and which are the
    dependent variables?
  • Are there control variables?
  • Is the hypothesis specific enough to be
    researchable yet still meaningful?
  • How certain am I of the relationship(s) between
    variables?

23
Steps in empirical research - 4
  • Operational definitions and measurement
  • What is the level of aggregation?
  • What is the unit of measurement?
  • How will the research variables be measured?
  • What degree of error in the findings is
    tolerable?
  • Will other people agree with my choice of
    measurement operations?

24
Steps in empirical research - 5
  • Research design and methodology
  • What is my overall strategy for doing this
    research?
  • Will this design permit me to answer the research
    question?
  • What other possible causes of the relationship
    between the variables will be controlled for by
    this design?
  • What are the threats to internal and external
    validity?

25
Steps in empirical research - 6
  • Sampling
  • How will I choose my sample of persons or events?
  • Am I interested in representativeness?
  • If so, of whom or what, and with what degree of
    accuracy or level of confidence?
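The sampling questions above (representativeness, degree of accuracy, level of confidence) have a standard quantitative answer for simple random samples: the minimum sample size needed to estimate a proportion. A minimal sketch in Python, assuming a normal approximation and the conservative choice p = 0.5; the function name and defaults are illustrative, not from the lecture:

```python
import math

def sample_size(confidence_z: float = 1.96, margin: float = 0.05, p: float = 0.5) -> int:
    """Minimum simple-random-sample size for estimating a proportion.

    confidence_z: z-score for the chosen confidence level (1.96 ~ 95%)
    margin:       acceptable margin of error (0.05 = plus/minus 5 points)
    p:            anticipated proportion (0.5 is the most conservative)
    """
    return math.ceil(confidence_z ** 2 * p * (1 - p) / margin ** 2)

print(sample_size())             # 95% confidence, +/-5 points -> 385
print(sample_size(margin=0.03))  # a tighter margin needs a larger sample
```

Tightening the margin of error or raising the confidence level grows the required sample quickly, which is one concrete form of the rigor-versus-practicality trade-off.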

26
Steps in empirical research - 7
  • Instrumentation
  • How will I get the data I need to test my
    hypothesis?
  • What tools or devices will I use to make or
    record observations?
  • Are valid and reliable instruments available, or
    must I construct my own?

27
Steps in empirical research - 8
  • Data collection and ethical considerations
  • Are there multiple groups, time periods,
    instruments, or situations that will need to be
    coordinated as steps in the data collection
    process?
  • Will interviewers, observers, or analysts need to
    be trained?
  • What level of inter-rater reliability will I
    accept?
  • Do multiple translations pose a potential
    problem?
  • Can the data be collected and subjects' rights
    still preserved?
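The inter-rater reliability question above is commonly answered with a chance-corrected agreement statistic such as Cohen's kappa. A minimal pure-Python sketch; the two rating lists are hypothetical, invented for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed agreement: share of items both raters coded identically
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Expected agreement if raters coded independently at their own base rates
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[label] * counts_b[label] for label in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codings of eight interview segments by two trained raters
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(a, b))  # 0.5
```

A project would set an acceptance threshold in advance (e.g., kappa above 0.6 or 0.7 on common benchmarks) and retrain raters until it is met.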

28
Steps in empirical research - 9
  • Data analysis
  • What combination of analytical and statistical
    processes will be applied to the data?
  • Which will allow me to accept or reject my
    hypotheses?
  • Do the findings show numerical differences, and
    are those differences important?

29
Steps in empirical research - 10
  • Conclusions, interpretations, recommendations
  • Was my initial hypothesis supported?
  • What if my findings are negative?
  • What are the implications of my findings for the
    theory base, for the background assumptions, or
    relevant literature?
  • What recommendations can I make for public
    policies or programs in this area?
  • What suggestions can I make for further research
    on this topic?

30
Steps in interpretative research
  • Interpretative research is a kind of qualitative
    research
  • The research design is more flexible than in
    empirical research
  • The data collection method, and even the research
    question, can change over time

31
Steps in interpretative research
  • Identification of an area of research
  • Formulation of research question
  • Choice of field site and participants
  • Pilot study (collect basic data)
  • Revise research question (if necessary)
  • Data collection and verification
  • Data coding and analysis
  • Interpretation
  • Conclusion and suggestion

32
Approaches to research
  • Remember the nature of the research question
    determines what approach you should use.
  • If the approach does not fit the nature of the
    question, you will not be able to answer the
    question in a meaningful way (i.e., the data
    collected cannot be used to support your argument)

33
Types of social research questions
  • Descriptive
  • These questions are designed to describe a
    situation or an existing phenomenon.
  • A study that seeks only to describe the
    proportion of people who hold various opinions
    is primarily descriptive in nature.
  • For instance, if we want to know the general
    opinions of HK citizens towards the work of the
    HKSAR Government, we are simply interested in
    describing the situation.

34
Types of social research questions
  • Relational
  • These questions are formulated to look at the
    relationships between two or more variables.
  • A survey that compares what proportion of males
    and females say they would vote for a candidate
    in the next legislative council election is
    essentially studying the relationship between
    gender and voting preference.
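A relational question like this is often tested with a chi-square test of independence on a contingency table. A plain-Python sketch of the test statistic; the counts are hypothetical, not survey data from the lecture:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table: [[a, b], [c, d]], e.g. rows = gender, columns = would/would not vote.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n  # under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: 60 of 100 males and 45 of 100 females would vote
stat = chi_square_2x2([[60, 40], [45, 55]])
# 3.841 is the 0.05 critical value of chi-square with 1 degree of freedom
print(round(stat, 2), "significant" if stat > 3.841 else "not significant")
```

A statistic above the critical value suggests gender and voting preference are related in the sample; it still says nothing about cause, which is the point of the next question type.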

35
Types of social research questions
  • Causal
  • These questions are designed to determine
    whether one or more variables cause or affect
    one or more outcome variables.
  • If we did an opinion survey to try to determine
    whether the 7.1 demonstration changed voter
    preferences, we would essentially be studying
    whether the demonstration (cause) changed the
    proportion of voters who would vote for
    'pro-democratic' candidates (effect).

36
Types of social research questions
  • The three types of questions are cumulative
  • A relational study must first describe the
    variables (by measuring them) before it can
    establish relationships between variables
  • A causal study would need to describe the cause
    and effect variables and show their relationship
    before it can establish a causal link. Causal
    studies are the most demanding

37
Evaluation research in education
  • Evaluation involves determining the worth, merit,
    or quality of an evaluation object
  • Evaluation is traditionally classified according
    to its purpose
  • Formative evaluation is used for the purpose of
    program improvement
  • Summative evaluation is used for the purpose of
    making summary judgments about a program and
    decisions to continue or discontinue the program
  • Evaluation is generally done by program
    evaluators and is focused on specific programs or
    products.

38
Evaluation research in education
  • A newer and currently popular way to classify
    evaluation is to divide it into five types
  • Needs assessment: Is there a need for this type
    of program?
  • Theory assessment: Is this program conceptualized
    in a way that it should work?
  • Implementation assessment: Was this program
    implemented properly and according to the program
    plan?
  • Impact assessment: Did this program have an
    impact on its intended targets?
  • Efficiency assessment: Is this program cost
    effective?

39
Quantitative approach
  • Assumption: social phenomena can be measured
    and quantified
  • Uses a branch of mathematics as a tool:
    statistics (probabilistic)
  • For example, a study concluded that 80 per cent
    of parents would use punishment rather than
    encouragement when their child failed an
    examination (an empirical study)

40
Quantitative approach
  • Quantitative: the study of samples (the larger
    the sample, the more accurate the prediction and
    hence the generalization)
  • Based on the sample, predict the population
    (rigorous and systematic)
  • Statistical generalization (e.g., there is an x
    per cent or y per cent chance that ...): it is a
    quantitative measure
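A statistical generalization of this kind is typically reported as a confidence interval around a sample proportion. A minimal sketch, assuming a normal (Wald) approximation; the 160-of-200 counts are invented to echo the 80-per-cent parent example on the previous slide:

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) confidence interval for a sample proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p - z * se, p + z * se

# Hypothetical raw counts behind an "80 per cent of parents" finding
low, high = proportion_ci(160, 200)
print(f"sample: 80.0%; 95% CI: {low:.1%} to {high:.1%}")
```

The interval, not the bare 80 per cent, is what licenses the generalization from sample to population, and it narrows as the sample grows.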

41
Qualitative approach
  • Assumption: not all phenomena are quantitatively
    measurable or quantifiable (i.e., in terms of
    amount, intensity or frequency)
  • The human as the instrument (e.g., in-depth
    interviewing, participant observation, action
    research)
  • More flexible and fluid than the more rigid
    quantitative study

42
Qualitative approach
  • Qualitative: the study of singularities
    (exploring the inner experience of individuals)
  • The study is an end in itself. We (the readers)
    determine whether the findings of the study are
    generalizable
  • Fuzzy generalizations (it is possible, or very
    likely, or unlikely that ...): it is a
    qualitative measure

43
Qualitative approach
  • Questions change over time
  • Expressions of subjectivity and biases
  • Multiple observers, repeated interviews (cycles
    of study)
  • Constant refinement and testing
  • Making the world real to the reader, so that the
    reader can recognize the authenticity of the study

44
Quantitative or qualitative
  • A rule of thumb: nomothetic questions tend to be
    quantitative and idiographic questions tend to be
    qualitative
  • What is a village-school principal's view of the
    recent educational reform in HK? (idiographic)
  • Have teachers been given enough support to
    integrate IT into their teaching? (nomothetic)

45
Quantitative or qualitative
  • Quantitative
  • Numbers
  • Big sample
  • Theory testing
  • Qualitative
  • Words
  • Small sample
  • Theory building

46
Quantitative or qualitative
[Diagram: research methods placed along a continuum
from theory building (meaning) to theory testing
(measurement): grounded theory, action research,
focus groups, case-based research, survey,
experiment; the feasible region spans the middle.]
47
Quantitative or qualitative
  • Keep in mind that most researchers probably do
    not hold either approach to be completely
    correct, but, instead, fall somewhere on a
    continuum between the two extremes
  • Many researchers mix ideas from both approaches
    into their research

48
Empirical educational research
  • Outcomes as predictions
  • Fuzzy generalizations
  • Statistical
  • Outcomes as interpretations
  • Stories (narrative-analytical accounts)
  • Pictures (descriptive-analytical accounts)

49
An overview of empirical educational research
[Diagram, after Bassey (1999): studies of samples
(experiments, surveys) lead to outcomes as
predictions (statistical or fuzzy generalizations);
studies of singularities (case studies, action
research) lead to outcomes as interpretations
(stories and pictures).]
50
Choice of research approach
  • A qualitative approach is indicated when
  • question starts with how or what
  • topic needs to be explored
  • theories need to be developed
  • requires natural setting
  • intended audience is receptive to qualitative
    analysis
  • sufficient time and resources
  • "wide-angle" lens
  • A quantitative approach is indicated when
  • question starts with why
  • variables are easily identified
  • theories are available to provide a starting
    point
  • can be in natural setting
  • intended audience is receptive to quantitative
    analysis
  • limited time and resources
  • "zoom" lens

51
Appropriate research approach
  • The researcher should aim to achieve the
    situation where blending qualitative and
    quantitative methods of research can produce a
    final product which can highlight the significant
    contributions of both, where qualitative data can
    support and explicate the meaning of quantitative
    research (Nau, 1995)

52
Class activity
  • Create a concept map of educational research
  • Compare your concept map with other group members

53
End of this lecture