Evaluating Teaching and Learning, with Kevin Barry (Notre Dame)
March 15-16, 2012
Kevin Barry, Interim Director of the Kaneb Center for Teaching and Learning at the University of Notre Dame, presented workshops on strategies and tools for effectively evaluating teaching and learning. He described the process Notre Dame began several years ago to develop a more extensive evaluation of teaching for the purposes of faculty renewal, promotion, and tenure. This process led to the articulation and use of a new set of criteria and a new student perception instrument, based on the work of Seldin, Chism, Feldman, and others.
Missed the workshops but interested in the topic? Here is a sampling of what Kevin Barry presented. In an effort to balance (and contextualize) the impact of student surveys in the evaluation of teaching, Notre Dame defined four elements that must be addressed in teaching evaluations for tenure and promotion. Each department determines how best to gather evidence for each element:
- Course Design
- Implementation
- Evaluation of Student Work
- Student Perception
The Kaneb Center's website provides possible strategies for evaluating each of these elements. For example, some departments expect teaching portfolios to include syllabi with clear learning goals, examples of graded student work, and self-reflective essays. Barry cited research that questions the value of peer observation of teaching, except when it uses a clear rubric or set of questions and when it provides formative evaluation (feedback to the instructor) rather than summative evaluation (evaluation for tenure, promotion, or merit purposes).
Notre Dame renamed their survey of student perceptions to make clear that students are expected to provide important feedback, but that their feedback does not constitute "evaluation," which is the job of department and university faculty colleagues. Now they call it the "Course Instructor Feedback" (CIF) form.
The CIF results are presented in a carefully designed, graphical form that clearly represents a mean score for each question, a confidence interval (shown as a purple line, to remind readers of the level of reliability of the results), and relevant comparisons based on type of course. (Below is one small image from a long and elaborate report; it is not the complete CIF result report.)
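To illustrate what a mean score with a confidence interval represents, here is a minimal sketch of one common way such figures might be computed. The ratings below are invented example data on a 5-point scale, and the normal-approximation formula is an assumption for illustration; it is not Notre Dame's actual CIF methodology.

```python
# Illustrative only: a mean and 95% confidence interval for a set of
# survey ratings, in the spirit of a CIF-style graphical report.
from math import sqrt
from statistics import mean, stdev

def rating_summary(ratings):
    """Return (mean, ci_lower, ci_upper) using a 95% normal
    approximation (z = 1.96) around the sample mean."""
    n = len(ratings)
    m = mean(ratings)
    half_width = 1.96 * stdev(ratings) / sqrt(n)
    return m, m - half_width, m + half_width

# Made-up ratings from one hypothetical survey question (5-point scale).
ratings = [4, 5, 3, 4, 4, 5, 2, 4, 5, 4]
m, ci_lo, ci_hi = rating_summary(ratings)
print(f"mean = {m:.2f}, 95% CI = ({ci_lo:.2f}, {ci_hi:.2f})")
```

The width of the interval shrinks as more students respond, which is one reason a report might draw it alongside the mean: it signals how much confidence a small class's average deserves.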
The CIF, administered on the web, includes 10 university-wide questions used for summative purposes, six more questions used to provide additional feedback (but not used for summative evaluation), and five student-designed questions whose results are reported to students along with course descriptions as they choose classes in subsequent semesters. There are also open-ended questions asking for discursive (rather than multiple-choice) feedback. Departments and individual instructors may add questions for their own purposes. By releasing the grades of students who complete all their feedback surveys three days earlier than other students' grades, they have reached a return rate of more than 80%--the same rate as with their previous paper instruments.
Kevin Barry's preliminary qualitative research on the results of this new set of evaluation tools suggests that the student feedback results--which produce composite numerical scores that are easily compared with one another--may still be over-represented in the summative evaluation of teaching at the higher levels of the evaluation process. One possible remedy would be to use rubrics to assign numerical scores to the qualitative evaluation of Course Design, Implementation, and Student Work as well.
Several of the documents resulting from Notre Dame's process are available on this website. These documents include the policies Notre Dame's faculty committee (ACPET) recommended as well as a list of the learning goals that faculty may adopt for their courses (in addition to writing their own).
Other documents (including a sample Course Instructor Feedback design report, the CIF questions, and sample CIF reports to faculty members and department chairs) are available to Xavier community members by request to email@example.com.