Computer Science Department Middle States Assessment

Transcript and Presenter's Notes
1
Computer Science Department Middle States
Assessment
  • Computer Science has 4 programs (minor,
    bachelor's, master's and doctorate) and therefore
    4 different plans in place - one for each of
    those programs
  • Assessment of some learning outcomes in each
    program is scheduled every year
  • Some assessments in each program were scheduled
    to be done based on classes from Spring 2006
  • Most learning outcomes are assessed on a 3-year
    rotation, but the more statistical ones are done
    yearly

2
Spring 2006 Assessments: Scheduled and Performed
  • 4 assessments for the undergraduate program
  • Programming skills
  • Mathematical and Analytical Reasoning
  • Project Management and Large Scale Programming
    Skills
  • Research, Writing and Presentation Skills
  • 4 assessments for the graduate program
  • Project Development
  • Peer Reviewed Publication at 3 years
  • Peer Reviewed Publication at graduation
  • Presentation at a conference at graduation
  • A small committee was created by the department
    chair to perform each of these scheduled
    assessments

3
Process for Each Assessment: April - December 2006
  • Discussed the list of scheduled assessments for
    the current semester
  • Created a committee for each needed assessment;
    each committee was asked to have its report back
    by the beginning of the following fall semester
  • Appointed a chair for each committee
  • Contacted each committee with the assessment
    description
  • Informed each committee about methods of
    assessment
  • Followed up with each committee to give
    additional guidance and answer questions
  • Reports filed and consolidated

4
Considerations when Selecting Committees
  • Faculty members not directly associated with that
    semester of the course
  • Somehow connected to the course in general
  • Previously taught that course
  • Taught a similar course at a different level
  • Teaches the course following it in the sequence
  • Mix of faculty members from different backgrounds
  • Teams maximizing these differences
  • Maximize involvement of the faculty members of
    the department

5
Undergraduate Program Assessment: Programming Skills
  • Chau-Wen Tseng and Nelson Padua-Perez
  • They used projects from CMSC 131 (Computer
    Science I)
  • This assessment will rotate through the
    intro-programming sequence in subsequent years
  • Looked at two projects: Company Database and
    Shape Decorator
  • Looked at project descriptions, 6 student
    implementations and supporting course materials
  • Determined students are able to proficiently use
    the Java constructs required for projects that
    are of moderate size (150-200 lines of code)
  • Suggestions for course improvement: The projects
    should deemphasize string input/output and its
    formatting details. Projects should be more
    open-ended
  • Suggestions for assessment improvement: A larger
    sampling of student projects and more specific
    criteria for what is needed would give more
    feedback for course content.

6
Undergraduate Program Assessment: Mathematical and Analytical Reasoning
  • Bill Gasarch and Evan Golub
  • They used the final exam from CMSC 250 (Discrete
    Structures)
  • Looked at one final exam question whose content
    is very important for the subsequent courses
  • Reviewed 20 exam papers chosen at random in such
    a way as to represent the proportionate number of
    students who received As, Bs and Cs for the final
    course grade (a sampling sketch follows this
    slide's notes)
  • Created their own grading criteria separate from
    what was used by the instructional staff to grade
    this question
  • Determined that 15 were Excellent or Very Good on
    this one question, 1 was Moderate, and 4 were
    Poor; 75% were at least Very Good.
  • Suggestions for course improvement: None given
  • Suggestions for assessment improvement: A larger
    sampling of questions (2 questions that are
    different in nature instead of one) and inclusion
    of students who did not successfully complete the
    course
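
As a rough illustration of the proportional sampling described above, the sketch below draws a 20-exam sample whose grade mix matches the class's grade distribution. The grade counts and exam IDs are hypothetical, and the use of Python's random.sample is an assumption for illustration, not the committee's actual procedure.

# Illustrative sketch only: select 20 exams so that As, Bs, and Cs appear in
# proportion to their share of the class (hypothetical counts, not CMSC 250 data).
import random

grade_pools = {
    "A": list(range(0, 60)),      # hypothetical exam IDs for A students
    "B": list(range(60, 140)),    # hypothetical exam IDs for B students
    "C": list(range(140, 200)),   # hypothetical exam IDs for C students
}
sample_size = 20
class_size = sum(len(pool) for pool in grade_pools.values())

sample = []
for grade, pool in grade_pools.items():
    # Each grade band contributes in proportion to its share of the class;
    # with other class sizes the rounded counts may need a small adjustment.
    k = round(sample_size * len(pool) / class_size)
    sample.extend(random.sample(pool, k))

print(f"Selected {len(sample)} exams:", sorted(sample))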

7
Undergraduate Program Assessment: Project Management and Large Scale Programming Skills
  • Pete Keleher and Udaya Shankar
  • They used a project from CMSC 412 (Operating
    Systems)
  • Looked at one stage of development of a
    multi-part project
  • Reviewed the project description and 3 student
    implementations
  • Used the criteria of clear and well documented
    code, well-designed functions, and evidence of
    good debugging practice
  • Determined that two of the three implementations
    did well on all three criteria; the third was not
    well documented and showed less sophisticated
    debugging techniques
  • Suggestions for course improvement: None given
  • Suggestions for assessment improvement: A larger
    sampling of students, possibly looking for more
    specific criteria since each student
    implementation is so large.

8
Undergraduate Program Assessment: Research, Writing and Presentation Skills
  • Bill Gasarch and Don Perlis
  • They used papers submitted for the CMSC Honors
    Program
  • They evaluated six papers submitted for Spring
    2006 graduation
  • Used the criteria of originality, significance,
    and presentation
  • They created a 0-3 scale for each of these
    criteria, graded independently, and added the two
    graders' scores. A paper was rated excellent if
    its combined scores included at least one 5 and
    one 4, with possibly one 3, across the three
    areas (see the sketch after this slide's notes).
  • Determined that all projects met the criterion of
    excellent on this scale.
  • Suggestions for course improvement: None given
  • Suggestions for assessment improvement: Possibly
    extending this assessment to the writing and
    research of non-honors students in order to
    evaluate the learning outcome for a larger
    population
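
One possible reading of the honors-paper scoring scheme above, sketched in code; the threshold interpretation, function names, and example grades are assumptions for illustration rather than the committee's actual tool.

# Illustrative sketch only: two graders each assign 0-3 per criterion, the two
# grades are summed, and a paper counts as excellent when the combined scores
# include at least one 5, at least one other 4, and nothing below 3 (one
# possible reading of the threshold described on the slide above).
CRITERIA = ("originality", "significance", "presentation")

def combined_scores(grader_a, grader_b):
    """Sum the two independent 0-3 grades for each criterion (range 0-6)."""
    return {c: grader_a[c] + grader_b[c] for c in CRITERIA}

def is_excellent(scores):
    ranked = sorted(scores.values(), reverse=True)
    return ranked[0] >= 5 and ranked[1] >= 4 and ranked[2] >= 3

# Hypothetical example paper, not actual student data
a = {"originality": 3, "significance": 2, "presentation": 3}
b = {"originality": 2, "significance": 3, "presentation": 2}
print(is_excellent(combined_scores(a, b)))  # True for this made-up paper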

9
Graduate Program Assessment: Project Development
  • James Reggia
  • He used a required project assigned for CMSC 726
    (Machine Learning)
  • Reviewed the project description and the student
    implementations of all projects submitted that
    semester
  • The project was to be implemented on an
    individual basis or in a team of size 2
  • There were a total of 13 submissions representing
    the 20 students in the class
  • The project required a proposal, a hypothesis and
    an application that tested the hypothesis
  • Used the criteria of originality, content,
    implementation effort, and report quality
  • Determined that expectations for project
    development on these criteria were exceeded and
    that students also gained valuable research
    experience
  • Suggestions for course improvement: None given
  • Suggestions for assessment improvement: None noted

10
Graduate Program Assessment: Peer Reviewed Publication at 3 Years
  • Michael Hicks, Neil Spring and Jan Plane
  • They used the database collected from the
    graduate review day held each April
  • There were 29 3rd year students who were still
    active in the program in April of 2006
  • 20 of those students had at least one reviewed
    publication since entering Maryland.
  • This means that 69% of those who are completing
    their third year have had at least one
    publication (a small worked calculation follows
    this slide's notes)
  • The original assessment proposed was to find out
    what percentage had submitted an article for
    review rather than to determine how many had been
    accepted, but we did not have a way to collect
    that data directly.
  • Suggestions for assessment improvement: Modify
    the assessment criteria to something that is more
    easily measured, such as the percentage who have
    published in a peer reviewed venue. The goal of
    75% is probably too high for those who are just
    completing their third year if the goal is
    publication rather than submission.
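
A minimal worked version of the rate reported on this slide, using the cohort numbers given above (20 of 29 active third-year students); the function name and the 75% goal check are illustrative only.

# Illustrative sketch: third-year publication rate from the slide's own numbers.
def publication_rate(with_publication, cohort_size):
    """Percentage of the cohort with at least one reviewed publication."""
    return 100.0 * with_publication / cohort_size

rate = publication_rate(20, 29)
print(f"{rate:.0f}% of third-year students had a publication")  # about 69%
print("Meets the 75% goal discussed above:", rate >= 75)        # False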

11
Graduate Program Assessment: Peer Reviewed Publication at Graduation
  • Samir Khuller, Heather Murray and Jan Plane
  • They used the data collected in a survey, during
    exit interviews, and on student web pages.
  • There were a total of 34 Ph.D. graduates in
    Summer 2005 through Spring 2006
  • 26 of those Ph.D. graduates had one or more peer
    reviewed publications
  • This means that 76% of those who are completing
    their Ph.D. program have had at least one
    publication
  • Suggestions for assessment improvement: The
    method of data collection used this year was not
    the most accurate, since none of the methods of
    discovery were required. The proposal is to
    insert a new question on the application for
    graduation specifically asking students to report
    refereed publications. This method should be
    more accurate since this form is required shortly
    before graduation.

12
Graduate Program Assessment: Presentation at a Conference before Graduation
  • Samir Khuller, Heather Murray and Jan Plane
  • They used the data collected in a survey, during
    exit interviews, and on student web pages.
  • There were a total of 34 Ph.D. graduates in
    Summer 2005 through Spring 2006
  • 29 of those Ph.D. graduates had presented at one
    or more conferences
  • This means that 82% of those who are completing
    their Ph.D. program have had at least one
    conference presentation
  • Suggestions for assessment improvement: The
    method of data collection used this year was not
    the most accurate, since none of the methods of
    discovery were required. The proposal is to
    insert a new question on the application for
    graduation specifically asking students to report
    presentations at conferences. This method should
    be more accurate since this form is required
    shortly before graduation.

13
Lessons Learned about the Assessment Process
Itself
  • Many lessons learned that will modify how future
    assessments are conducted
  • More guidance to faculty selected for the
    committees
  • Qualitative rather than quantitative results are
    difficult to compare to goals
  • Make sure there is a large enough sample size
    even if the number of criteria has to be reduced
    to make it practical
  • Most committees reported at length on what they
    did but gave less detail about the assessment
    itself
  • More realistic evaluation methods
  • Wording of the learning outcome
  • Clearer specification of assessment measure
  • More specific criteria