Title: implementing film school assessment
1. implementing film school assessment
- Chapman University School of Film and Television
Ken O'Donnell
2. the film school
- 850 of the 4,200 students on the Orange campus
- produces 233 films each academic year
- 30 full-time faculty
3. the school's ambitions
- stop growing
- make better films
- graduate a few stars
- join the top tier of film schools
Bob Bassett, dean
- University of Southern California
- UCLA
- New York University
- Columbia
- Chapman University
- Florida State University
- University of Texas at Austin
- Boston University
- American Film Institute
- Sundance Workshop
4. current assessment instruments (adopted fall 2000)
- knowledge-based multiple-choice test
- career-based alumni telephone interview
- skill-based juried review of creative work
5. knowledge-based multiple-choice test
- 27. A finished film suitable for projection in any movie theater is called the
  a. rough cut.
  b. rushes.
  c. answer print.
  d. release print.
  e. fine cut.
- 58. The idea developed by French critics and filmmakers in the 1950s that the director is the controlling force in the creation of films is known as
  a. genre theory.
  b. the studio system.
  c. the rule of thirds.
  d. the 180-degree rule.
  e. the auteur theory.
- 89. The "Odessa Steps" sequence takes place in which film?
  a. Birth of a Nation
  b. The Gold Rush
  c. Citizen Kane
  d. Intolerance
  e. Battleship Potemkin
6. career-based alumni telephone interview
- What were some specific strengths and weaknesses of the School of Film and Television program?
- Have you submitted films to film festivals?
- Have any of your entries been awarded or recognized?
- What portion of your income is earned using the skills you learned at Chapman Film School?
7. skill-based faculty review of creative work
- the system evaluates 120 projects per year
- four faculty members judge each project, using internally-developed scoring rubrics
- faculty who supervised production of a certain project don't assess it
- results are tabulated, then made available to the school without student or faculty names attached (a sketch of this tabulation follows below)
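The tabulation step can be pictured with a small sketch. The code below is an assumption about how such a system could work, not the school's actual software: each judge's rubric items are averaged, the four judges' averages are combined per project, and only anonymized per-project numbers are reported. The sample records and the `tabulate` helper are hypothetical.

```python
# A minimal sketch (assumed, not the school's actual system) of tabulating
# juried scores: four judges per project, supervising faculty already excluded,
# results reported without student or faculty names.
from statistics import mean

# Hypothetical records: (project_id, judge, rubric item scores on a 1-10 scale)
scores = [
    ("proj-01", "judge A", [7, 8, 6]),
    ("proj-01", "judge B", [6, 7, 7]),
    ("proj-01", "judge C", [8, 8, 7]),
    ("proj-01", "judge D", [7, 6, 8]),
]

def tabulate(records):
    """Average each judge's rubric items, then average across judges per project."""
    by_project = {}
    for project_id, _judge, items in records:
        by_project.setdefault(project_id, []).append(mean(items))
    # Only anonymized, per-project averages go back to the school.
    return {pid: round(mean(judge_means), 2) for pid, judge_means in by_project.items()}

print(tabulate(scores))  # {'proj-01': 7.08}
```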
8. how we got here
- put the cart before the horse
- don't let people suspect right away how big the effort will become
9. implementation years
- year 1: devise multiple-choice test, alumni survey, and project questionnaires
- year 2: refine procedures, reduce labor
- year 3: make results reliable
- year 4: tie to curriculum goals and syllabi
10. year 1: prove feasibility
- develop multiple-choice test with faculty
- conduct alumni survey
- score finished student films
- eight projects were randomly selected
- average of 25 untrained judges per project
- process was cumbersome
- results were of negligible worth
11. year 2: refine scoring rubrics
24. dialogue recording
  The dialogue was not intelligible. (1 2 3)
  The dialogue was coherent, but sounded unnatural. (5 6 7)
  The dialogue sounded natural and dynamic. (8 9 10)
5. story development
  The story was developed in ways that failed to exploit its potential. (1 2 3)
  The story was somewhat or occasionally successful in exploiting its premise. (5 6 7)
  The story was developed in ways that exploited all of its dramatic, comic, and/or emotional potential. (8 9 10)
12. performance
  Acting was wooden, not credible. (1 2 3)
  Some performances were credible, but not all. (5 6 7)
  Performances were convincing, compelling, and emotionally involving. (8 9 10)
18. camera operating
  Many shots had pans and/or tilts that were not smooth, therefore distracting the viewer from the film's content. (1 2 3)
  On occasion shots had bumps or bobbles in the pans and/or tilts that were momentarily distracting. (4 5 6 7)
  The camera operation was invisible or appropriate to the film. (8 9 10)
12. the scoring rubrics
- length varies by project type: 10-29 questions
  - motion picture or television program
  - broadcast journalism piece
  - screenplay
  - public relations or advertising campaign
  - new this year: scholarship applications, critical studies
- judges score each item 1-10 (see the sketch after this list)
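As a way of making the rubric structure concrete, here is a hedged sketch of how items scored 1-10 with anchored descriptions might be organized per project type. The `RubricItem` structure and `MOTION_PICTURE_RUBRIC` name are assumptions for illustration; the item wording is drawn from the sample rubrics shown in this presentation.

```python
# A hedged sketch of a rubric as data: each project type carries its own item
# list (10-29 items in the real rubrics), and each item has anchored
# descriptions for the low, middle, and high ends of the 1-10 scale.
from typing import NamedTuple

class RubricItem(NamedTuple):
    number: int
    name: str
    low_anchor: str    # describes scores 1-3
    mid_anchor: str    # describes scores 4-7
    high_anchor: str   # describes scores 8-10

# Item wording drawn from the sample rubrics above; the structure is assumed.
MOTION_PICTURE_RUBRIC = [
    RubricItem(14, "shot design",
               "angles and camera moves were poorly chosen and executed",
               "angles and camera moves were competently chosen",
               "angles and camera moves were effective"),
    RubricItem(24, "dialogue recording",
               "dialogue was not intelligible",
               "dialogue was coherent, but sounded unnatural",
               "dialogue sounded natural and dynamic"),
]

def validate_score(score: int) -> int:
    """Judges score each rubric item from 1 to 10."""
    if not 1 <= score <= 10:
        raise ValueError("rubric scores must fall between 1 and 10")
    return score
```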
13. sample rubric
14. shot design
  The angles and camera moves that made up the scenes were poorly chosen and executed, and detracted from the effectiveness of the film. (1 2 3)
  The angles and camera moves were competently chosen though only aided the intention of the film in some scenes. (4 5 6 7)
  The angles and camera moves were effective in capturing the important moments of the film. (8 9 10)
14. year 2: attach project assessment to awards ceremony judging
- awards tradition dated back about five years
- it flourishes as the means to meet a range of goals:
  - student motivation
  - recruitment for admissions and faculty searches
  - development and public relations
15. year 2: faculty judges
- two judges per film
- little publicity or promotion
- awards nominations were clustered around only a few projects
- conspiracy theories abounded among students
This was the year of our school-wide faculty buy-in.
16. year 3: get valid results
- pre-awards season norming session
- twice the number of judges per project
- strength-of-certainty indicator (see the sketch after this list)
- clearer split between assessment purposes and awards purposes
  - nominees versus winners
- early publicizing to students
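The strength-of-certainty indicator invites one obvious use: weighting each judge's score by how certain the judge felt, so tentative scores move a project's result less than confident ones. The sketch below is only a plausible reading of that bullet, with an assumed 1-5 certainty scale and a hypothetical `weighted_project_score` helper; it is not the school's documented formula.

```python
# An assumed (not documented) use of a strength-of-certainty indicator:
# each judgment is a (score 1-10, certainty 1-5) pair, and certainty acts
# as the weight in a weighted average.
def weighted_project_score(judgments):
    """Return the certainty-weighted average score for one project."""
    total_weight = sum(certainty for _, certainty in judgments)
    return sum(score * certainty for score, certainty in judgments) / total_weight

# Example: two fairly confident judges near 7-8, one unsure judge at 4.
print(weighted_project_score([(8, 5), (7, 4), (4, 1)]))  # 7.2
```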
17. year 4: next steps
- integrate assessment results with ongoing
overhaul of curriculum
- use results of multiple-choice test, alumni survey, and project evaluation to analyze and adjust course effectiveness
18. bottom-up assessment
- PRO
- implementation is easier, less abstract
- takes advantage of existing infrastructure
- looks doable
- CON
- requires retrofitting outcomes to goals
- can look like steering without a rudder
19. for any approach
- harness existing, school-wide, juried evaluations of student work
- align assessment with existing school goals
- consult a few experts first, then cite them often
- do anything for the sake of getting started, then fix it
- agree with naysayers
- repeat half truths that will produce results
20. favorite half truths
- "It's the same as grading, except we do each other."
- "We're already doing all this, we just have to make it clear to outsiders."
- "This will be easier to talk about after we get some results back."
- "This is just the pilot year anyway."