1
Proposal Defense
PERCEPTION AND PRACTICE: A TRIANGULATION STUDY OF
FIVE IDENTIFIED TRAITS RELATED TO SUCCESS IN
DELAWARE AND MARYLAND PUBLIC HIGH SCHOOLS
  • Sean M. Lennon

2
Research Action Committee
  • Members
  • Dr. Calo
  • Dr. Costello
  • Dr. Miller
  • Dr. Wray
  • Dean's Representative
  • Chair
  • Dr. Lane
  • Acknowledgements
  • Dr. Gong
  • Dr. Savini

3
Introduction to the Study
  • To determine possible traits that relate to
    success for a public high school in Delaware and
    a public high school in Maryland
  • Define possible traits that may, or may not,
    lead to success
  • Determine whether the literature defining possible
    traits or concepts relates to success

4
Background of the Problem
  • Accountability
  • Mandates and Reforms
  • Knowledge being necessary to good government and
    the happiness of mankind, schools and the means
    of education shall forever be encouraged
  • Time frame of American public education
  • Public school system originally controlled by
    local community with specific needs and wants
  • Intrusiveness of federal oversight meets local
    resistance
  • Federal and state oversight increase in dominance
    or control over local school systems
  • State and Local grievances

(Thorpe, 1909, p. 957)
5
Background of the Problem
  • Defining Success
  • Definitions and expectations differ within select
    populations and groups
  • Behaviorist/Holistic Approach
  • The process or proof of learning
  • Prevalent among educators
  • Lack of accountability
  • Assessment Approach
  • Utilization of assessment driven data
  • Prevalent among policy makers, general populace
  • Assessments limited to specific indicators
  • Tests not designed or utilized to specific
    purpose or population

(Kochan et al., 1996)
(Cheng, 1997)
(Popham, 2002)
6
Successful Schools
  • Defined by No Child Left Behind
  • Assessment Approach
  • Established set parameters for
  • Testing benchmarks
  • School performance
  • Growth
  • Demographic/group performance
  • School accountability
  • AYP and AMO

(Delaware Department of Education, 2005)
7
Need For the Study
  • Define success
  • To meet the new testing benchmarks
  • Delineate traits that relate to success
  • Determine traits that may have no relation to
    success
  • Utilize modern statistical methodologies
  • Study
  • Combines delimited traits with the new testing
    accountability standards while utilizing mixed
    methods research techniques
  • The application of this methodology to traits
    associated with assessments appears to be unique
  • Internet search yielded no correlations

8
Purpose of This Study
  • To sample two successful schools
  • Determined through published test data
  • State web sites
  • Delaware Dept. of Education
  • Maryland Dept. of Education
  • Utilizing
  • Separate school systems
  • Separate states
  • Separate accountability systems
  • Isolate common practices, traits and/or
    ideologies
  • Find commonalities related to success
  • Apply these commonalities to the general
    educational system

9
Significance of This Study
  • Better implement and meet assessment and
    accountability programs
  • Federal
  • State
  • To determine best practices
  • Assist schools with limited resources and time to
    focus on what works
  • Delineate the published data
  • Find possible traits/practices for all public
    (American) high schools
  • To highlight local successful schools

10
Educational Leadership
  • Leadership
  • Educational Leadership
  • Usually defined in terms of goal attainment or
    accountability
  • Federally mandated criteria
  • State criteria
  • Local criteria

"a process whereby an individual influences a
group of individuals to achieve a common goal"
(Northouse, 2004, p. 61)
"the process of persuasion by which a group is
induced to pursue objectives"
(Senge, 2000, p. 14)
11
Educational Leadership
  • Personal Experience
  • Teacher with personal experience with
    accountability
  • Experienced with the nuances of the classroom and
    school environment beneficial/detrimental to
    success
  • Relationship of leadership to this study
  • Determine what works or not in leading a
    successful school
  • Determine what success is, as defined by
    different constituents
  • Determine influential traits in relation to
    limited or dwindling resources
  • Determine the extent, history and implications of
    the accountability movement and assessments
  • Learn how to be a successful educational leader

12
Indicators of Success
  • cumulative trait chart.doc
  • Delineated 5 traits from literature
  • Collaborative work
  • High expectations
  • Professional Development
  • School culture
  • Strong leadership
  • Frequency chart
  • Utilized literature containing multiple
    indicators
  • Not limited to educational
  • Counted the frequencies, or the number of times a
    trait was mentioned
  • Culture was defined by its component parts as
    described by Deal & Peterson (1999), then weighted
    as 0.5 or ½ a point (see the sketch below)
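The weighted frequency tally described above can be illustrated with a minimal Python sketch. This is not part of the original study; the mentions list is a hypothetical placeholder, and culture components are collapsed into a single "culture" trait at half a point each, as the slide describes.

```python
# Hypothetical sketch of the weighted trait-frequency tally; the
# "mentions" list stands in for traits extracted from the literature.
from collections import Counter

mentions = [
    "collaborative work", "high expectations", "strong leadership",
    "professional development", "culture:rituals", "culture:values",
    "high expectations", "strong leadership",
]

weights = {"culture": 0.5}  # culture components count as half a point each

scores = Counter()
for m in mentions:
    trait = m.split(":")[0]  # collapse culture components into "culture"
    scores[trait] += weights.get(trait, 1.0)

for trait, score in scores.most_common():
    print(f"{trait}: {score}")
```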

13
Sample Frame
  • "A sampling frame is the list or quasi list of
    elements from which a probability sample is
    selected."
  • Sample Frame needs to include
  • Public high schools
  • 2 states
  • Deemed successful or high performing
  • Faculty from two selected schools
  • Administration from two selected schools
  • Faculty and administration of a successful public
    high school in Delaware and a successful public
    high school in Maryland

(Babbie, 2002)
14
American Public Education System
Delaware
Maryland
15
Delaware Public School System
Sussex Technical School District Sussex
Technical High School
Maryland Public School System
Talbot County Public Schools Easton High
School
16
Sample Design
  • Nonprobability
  • Specific constructs
  • Convenience
  • Schools near each other
  • Schools near the researcher
  • Purposive
  • "It can be appropriate to select a sample on the
    basis of knowledge of a population, its elements
    and the purpose of the study"
  • Public high school faculty and administrators
  • Faculty and administration of Easton High School
    (MD) and Sussex Technical High School (DE)

(Babbie, 2002, p. 178)
17
Delmarva (Southern Delaware) High Schools
(Delaware Department of Education, 2005)
18
Delmarva (Maryland) High Schools
(Maryland Report Card, 2005)
19
Easton High School
Easton High Web Page
(Public School Review, 2005)
20
Sussex Technical High School
Sussex Technical Web Page
(Public School Review, 2005)
21
Limitations/Delimits
  • Limitations
  • Traits may not transfer to other schools
  • Ambiguity in defining/assessing success
  • Bias
  • Specific demographics to the localized region
  • Limited demographic variances in target schools
  • Delimitations
  • Limit to two schools
  • Quantitative sample group limited to 100
    respondents
  • Qualitative sample group limited to 4 respondents
  • Traits, or indicators, limited to 5 variables
  • Instrument limited to no more than 4 questions per
    domain/variable
  • Qualitative questionnaire limited to 1 central
    question and 5 sub-questions

22
Research Design
  • Mixed Method
  • Suggested/conceptualized in 1959
  • Not accepted until the 1980s
  • Utilization of qualitative and quantitative
    concepts and methodologies
  • The third wave or third research movement
  • Recent increase in use or popularity is due to
    its pluralism or eclecticism, which frequently
    results in superior research
  • Utilizes two or more modes of inquiry
  • Garner a higher level of validity or
    corroboration
  • To limit or find faults possibly missed in a
    monomethod study

(Rocco et al., 2003)
(Johnson & Onwuegbuzie, 2004, p. 17)
(Johnson & Onwuegbuzie, 2004, p. 14)
23
Mixed Method becomes more accepted in Social
Research
24
(Rocco et al., 2003)
(Johnson & Onwuegbuzie, 2004)
25
Concurrent Triangulation Method
  • John Creswell
  • Earliest proponent of mixed method research
  • Considered the sage of this movement
  • One of the most common of the mixed method
    approaches
  • Uses inherent strengths of each paradigm to
    cancel out their weaknesses
  • "Triangulation can note the convergence of the
    findings as a way to strengthen the knowledge
    claims of the study or to explain any lack of
    convergence that may result"
  • Details
  • Utilizes both qualitative and quantitative
    methods
  • Done simultaneously
  • Each component checked against the other for
    convergence or corroboration

(Creswell, Trout, & Barbuto, 2002)
(Creswell, 2003, p. 217)
26
Research Design
(Creswell, 2003)
27
Research Question
  • Question
  • Are there commonalities, or common attributes for
    success, in two different, geographically
    separated, high achieving public schools that can
    be inferred to the general public high school
    population?
  • Sub-questions
  • Is collaborative work an indicator of success and
    a common attribute in successful schools?
  • School culture?
  • High expectations?
  • Professional development?
  • Strong leadership?

28
Independent Variables
  • Variables that (probably) cause, influence or
    affect outcomes
  • Variables controlled by the experimenter
  • Independent variables are the five indicators or
    domains
  • Collaborative work
  • School culture
  • High expectations
  • Professional development
  • Strong leadership
  • Discrete variables
  • Variables that take on a small set of possible
    values

(Creswell, 2003, p. 94)
(Howell, 2004, p. 22)
(Howell, 2004, p. 22)
29
Dependent Variable
  • Success is the dependent variable
  • Defined through the assessment approach
  • Federal standards (NCLB)
  • Continued growth (AYP)
  • Delineated to specific demographic groups meeting
    success (AMO)
  • Safe Harbor: utilize other criterion standards
    to meet AYP
  • 100% success rate by 2014
  • State standards
  • Delaware Student Testing Program
  • Compares districts and schools
  • Offers dual diplomas
  • Maryland High School Assessment
  • Compares districts and schools
  • Required successful completion for graduation
    (class of 2009)
  • Both schools have met AYP in the previous testing
    block and are in the top tier for state testing

(U.S. Department of Education, n.d.)
(Delaware Department of Education, 2003)
(Maryland State Department of Education, 2003)
30
Instrumentation
  • Qualitative
  • Questionnaire
  • Central Question
  • What makes your school successful?
  • 5 sub-questions pertaining to each domain
  • Utilizes a frequency chart
  • Content coding analysis
  • Embedded Likert scales (5-point)
  • Quantitative
  • Survey
  • 5 Domains
  • Likert Scaled
  • 5 point scale
  • 3 to 4 questions per domain
  • Combined
  • Both instruments utilize same statistical scaling
  • Testing
  • Triangulation

31
Table 6: Instrument Question Pilot Study

Subject   M/F   Certification   Experience   Ethnicity
1         M     SS              5 years      AA
2         M     SS              5-10 years   Caucasian
3         M     English         20 years     Caucasian
4         F     FL              10 years     Caucasian
5         F     Science         10 years     Caucasian
6         F     LS              20 years     Caucasian
7         M     SS              20 years     Caucasian
8         M     SS              10 years     Caucasian
9         M     CTE             5 years      AA
10        F     FL              5 years      Hispanic
11        F     English         20 years     Caucasian
12        F     SS              20 years     Caucasian
13        F     CTE             20 years     AA
14        M     NS              5-10 years   Caucasian
15        F     Math            20 years     Caucasian
16        M     Math            5-10 years   Caucasian
17        F     Sp. Ed.         New          Caucasian
18        M     CTE             New          Caucasian
19        M     H./P.E.         20 years     Caucasian
20        F     FL              20 years     Caucasian
21        F     Math            5-10 years   Caucasian
32
Reliability/Validity
  • Reliability
  • The consistency or reproducibility of test
    scores
  • Determined differently through qualitative and
    quantitative methodologies
  • Qualitative is produced by the consistency of the
    interviewer through repeated measures
  • Quantitative is produced with the consistency of
    the instrument and its questions through repeated
    responses
  • Validity
  • Qualitative and Quantitative methodologies
    conceptualize validity differently
  • Qualitative
  • Not a fixed construct or concept
  • A continual, rigorous process of the research
    design
  • Quantitative
  • "Validity refers to the extent to which an
    empirical measure adequately reflects the real
    meaning of the concept under consideration"
  • Usually refers to the instrument

(Fairchild, n.d., p. 3)
(Weber, 1990)
(Garson, n.d.)
(Morse et al., 2002)
(Babbie, 2002)
33
Reliability
Quantitative Survey Instrument
Cronbach's Alpha (α) (Internal Consistency) by Domain

Domain                     # of Questions   α score
Collaborative Work         4                .822
Culture                    3                .772
High Expectations          3                .719
Leadership                 4                .737
Professional Development   4                .818

Performed by SYSTAT Version 9 for Windows
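The alpha values above were produced in the study with SYSTAT; a minimal NumPy sketch of the same internal-consistency calculation is shown below. The response matrix is a hypothetical placeholder for one domain's 5-point Likert answers.

```python
# Cronbach's alpha for one domain: responses is an
# (n_respondents x n_items) array of 5-point Likert answers.
import numpy as np

def cronbach_alpha(responses):
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                         # items in the domain
    item_vars = responses.var(axis=0, ddof=1)      # per-item variance
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical correlated responses for a 4-item domain (e.g., Collaborative Work)
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))    # shared underlying attitude
noise = rng.integers(-1, 2, size=(30, 4))  # small per-item deviation
demo = np.clip(base + noise, 1, 5)
print(round(cronbach_alpha(demo), 3))
```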
34
Validity
Quantitative Instrument
  • Construct Validity
  • Factor Analysis
  • Assumes the use of interval or ratio data
  • Ordinal can be used but makes interpretation of
    the results harder to define
  • Factor analysis was performed (see the sketch
    below)
  • Canonical correlations were deemed too high to be
    reliable
  • Data were assumed to be compromised and removed
    from the study
  • Criterion Validity
  • Not applicable due to originality of the
    instrument and limited by time constraints of the
    research design
  • Face and Content Validity
  • Content Validity
  • 3 experts or educators possessing experience
    and/or terminal degrees were asked to assess the
    instruments
  • Dr. Sue Lester
  • Dr. Carol Visintainer
  • Dr. Pat Savini

(Garson, n.d.)
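A minimal sketch of how a factor analysis like the one described above could be run in Python, assuming the Likert responses are treated as interval data. The survey matrix and the scikit-learn usage here are illustrative assumptions, not the study's own analysis.

```python
# Illustrative construct-validity check: fit a 5-factor model
# (one factor per hypothesized domain) to a hypothetical survey matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(100, 18)).astype(float)  # 100 respondents x 18 items

fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(X)
print(fa.components_.shape)  # (5, 18) loading matrix: factors x items
```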
35
Qualitative Component
  • Philosophical differences between qualitative
    methodologies and quantitative methodologies
  • Reliability and validity
  • Purists argue these terms are pertinent to
    quantitative rather than qualitative
  • Substitutes have been suggested including
  • Inquiry
  • Post hoc assessments
  • Peer debriefing
  • Triangulation
  • Internal validity versus external validity
  • Internal: able to confirm relationships among the
    variables being tested
  • Not as strict in qualitative
  • External: able to infer results to other
    populations
  • Of no importance in qualitative
  • Reliability
  • "If the principles of qualitative inquiry are
    followed, the analysis is self-correcting"
  • Reliability is a process of the methodology and
    design, incorporated throughout the research

(Morse et al., 2002)
(Winter, 2000)
(Morse et al., 2002, p. 12)
36
Data Collection
  • Easton High School
  • Survey
  • Presented at faculty meeting
  • Researcher will read from a script, then direct
    respondents
  • Collect immediately
  • Questionnaire
  • Face to face interview in respondent's office or
    neutral location
  • Questionnaire collected immediately
  • Sussex Technical High School
  • Survey
  • Placed in faculty mailboxes
  • Directions will be stated through cover letter
  • Completed surveys will be collected by the
    district receptionist and placed securely in her
    office
  • Questionnaire
  • Face to face interview in respondent's office or
    neutral location
  • Questionnaire collected immediately

37
Statistical Tests
  • Inferential
  • Bivariate
  • Multiple variables
  • Discrete
  • Non-parametric
  • More sensitive to medians than means
  • No assumption about the distribution
  • Controversy: ordinal or interval data?

(Howell, 2004, p. 467)
38
Ordinal v. Interval
  • Likert Scales can be argued as both
  • The level of measurement depends upon properties
    of the scale
  • The number of scales can also be a factor
  • If ordinal, nonparametric tests are normally used
  • If interval, parametric tests are normally used
  • What do you pick?

(Goldstein & Hersen, 1984)
(Clason & Dormody, 2005)
(Mogey, 1999)
Likert Scale
39
Advantages for nonparametric
  • Doesn't rely on a normal distribution of the
    population
  • More sensitive to medians than means
  • Utilizes rankings, which are more applicable to
    Likert scales
  • More objective if rankings are ordinal
    measurements

(Howell, 2004)
(Clason & Dormody, 2005)
(Dallal, 2003)
40
Disadvantages to Nonparametric
  • Can be limited in specificity
  • Lack of parameters
  • Lack power
  • The probability of rejecting H0 when it is false
  • Statistical Power
  • Note: power can be increased through sample size
    (see the simulation sketch below)
  • ARE (Asymptotic Relative Efficiency)
  • Controversial

(Dallal, 2003)
(Wikipedia, 2006)
(Dallal, 2003)
(Lane, 2006)
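As noted above, power can be raised by increasing the sample size. A minimal Monte Carlo sketch (not from the original study) estimates Mann-Whitney power at two hypothetical sample sizes given a small true shift between groups.

```python
# Monte Carlo estimate of Mann-Whitney power at a given sample size.
import numpy as np
from scipy.stats import mannwhitneyu

def estimated_power(n, shift=0.5, trials=2000, alpha=0.05, seed=4):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(trials):
        a = rng.normal(0.0, 1.0, n)    # group A: no shift
        b = rng.normal(shift, 1.0, n)  # group B: small true shift
        _, p = mannwhitneyu(a, b, alternative="two-sided")
        rejections += p < alpha
    return rejections / trials

print(estimated_power(n=20))   # smaller sample, lower power
print(estimated_power(n=80))   # larger sample, higher power
```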
41
Differences
  • Kruskal-Wallis
  • Corresponds to the one-way analysis of variance
    test (F test)
  • 3 or more samples
  • Test
  • Each question from Easton High group (1-18) to
    Sussex Technical High group (I-XVIII)
  • Mann-Whitney
  • Common test
  • Corresponds to the t test (for independent
    samples)
  • 2 independent samples
  • Tests (see the scipy sketch below)
  • Every question within each domain or trait in
    both groups (individually tested)
  • Qualitative tested to quantitative domain in each
    group

(Dallal, 2000)
(LTDI, 1999)
(Dallal, 2000)
(Howell, 2004)
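A minimal scipy.stats sketch of the two difference tests named on this slide. The two Likert response vectors are hypothetical placeholders for matched questions from the Easton High and Sussex Technical groups.

```python
# Mann-Whitney U (two independent samples) and Kruskal-Wallis H
# on hypothetical matched-question Likert responses.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(2)
easton = rng.integers(1, 6, size=50)   # Easton High group answers
sussex = rng.integers(1, 6, size=50)   # Sussex Technical group answers

u_stat, u_p = mannwhitneyu(easton, sussex, alternative="two-sided")
h_stat, h_p = kruskal(easton, sussex)  # usable with 3+ groups; two shown here

print(f"Mann-Whitney U={u_stat:.1f}, p={u_p:.3f}")
print(f"Kruskal-Wallis H={h_stat:.2f}, p={h_p:.3f}")
```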
42
Association
  • Spearman rank correlation coefficient
  • Spearman Rho
  • Corresponds to the Pearson correlation
    coefficient
  • Utilizes rankings, not raw values
  • Correlates ranked data
  • Measure of a monotonic relationship
  • The line is always rising or falling, though not
    always straight
  • Tests (see the sketch below)
  • Each question (1-18) from Easton High group to
    Sussex Technical High group (I-XVIII)
  • Each domain (1-5) from Easton High group to
    Sussex Technical High group (I-V)
  • Each complete sample group, including both
    qualitative and quantitative questions (Easton
    High School) to the other complete sample group
    (Sussex Technical High School)

(Dallal, 2000)
(Howell, 2004)
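A minimal scipy.stats sketch of the association test above: Spearman's rho correlating matched question scores across the two schools. The data here are hypothetical placeholders.

```python
# Spearman rank correlation across matched questions from the two schools.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
easton_scores = rng.integers(1, 6, size=18)  # e.g., per-question medians, Easton High
sussex_scores = rng.integers(1, 6, size=18)  # matched questions, Sussex Technical

rho, p = spearmanr(easton_scores, sussex_scores)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")
```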
43
Testing Matrix
44
Closure/Supplemental Data
  • Equation Key
  • Dissertation Database
  • References
  • Closure
  • A triangulation study comparing a high school in
    Delaware to a high school in Maryland
  • 5 traits related to success
  • Thank you for your time
  • Questions?