Evaluating Teaching: Being a Guinea Pig in SFU's Teaching and Course Evaluation Project

It is no secret that many instructors are dissatisfied with the current course evaluation system at SFU. To bring course evaluation into the “new millennium”, SFU has been developing a new approach known as the Teaching and Course Evaluation Project (TCEP). This past summer, the TCEP ran a proof-of-concept program to test a new evaluation system that included 14 lecture/seminar courses and 4 CODE courses, for a total of 1,329 student evaluations. Nienke van Houten and Rochelle Tucker from FHS both took part in the study and shared their experiences in a debriefing session with fellow faculty members on October 24, 2013. They were joined by Corinne Pitre-Hayes from the TLC, who manages the TCEP project team.

Introduction to TCEP:

Corinne provided an overview of the project. The goal of the Teaching and Course Evaluation (TCE) project has been to make recommendations governing the selection and development of an updated system for student evaluation of teaching and courses. The project took a pragmatic approach, starting with a literature review in support of the project's goals and consultation with other institutions. At the heart of the approach was a strong focus on engaging the SFU community. Toward the end of the project, a small-scale demonstration of the emerging recommended approach was conducted to obtain practical feedback from SFU instructors and students. The final report of the TCE project is scheduled to go before Senate on January 6, 2014.

The final report and associated documentation can be found at:


Features of the new system:

On-the-ground experience:

Implementation of the evaluation project occurred in two stages: question selection and encouraging student participation. Nienke used HSCI 100 – Human Biology as her “guinea pig” course. This is a first-year breadth science course offered at the Surrey campus with an enrollment of 82 students. Question selection was handled through a straightforward web-based system: custom multiple-choice questions (MCQs) were selected from a list of options, or questions were typed directly into text boxes. Encouraging student participation was aided by the ability to track student response rates. This allowed Nienke to remind students to complete the survey and to identify individual students who hadn’t responded. She found that adding incentives improved response rates: by offering all participants a “bonus point” if the class reached an 80% response rate, she achieved a 75% response rate, much higher than previous evaluation response rates for the same class.
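The incentive mechanism described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not part of the TCEP system: the function names, the stand-in student IDs, and the class-wide bonus rule are all assumptions made for the example.

```python
def response_rate(responded, enrolled):
    """Fraction of enrolled students who completed the evaluation."""
    return len(responded) / enrolled

def class_bonus(responded, enrolled, threshold=0.80, bonus=1):
    """Award every student a bonus point only if the class hits the threshold."""
    return bonus if response_rate(responded, enrolled) >= threshold else 0

# Illustrative numbers: 82 enrolled (as in HSCI 100), 62 responses (~75.6%)
enrolled = 82
responded = set(range(62))  # stand-in student IDs

print(round(response_rate(responded, enrolled), 3))  # 0.756
print(class_bonus(responded, enrolled))              # 0 (below the 80% threshold)
```

Because the threshold is class-wide, each student has an incentive to encourage peers to respond, which may explain the jump in participation even when the threshold itself is not quite reached.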

Nienke expects the information from the TCEP to inform her teaching practice. Student responses to her custom questions revealed that the course materials are valuable, but that she needs to incorporate them into the lectures more frequently. Data from the second, open-ended question were less useful. This was probably due to poor question construction, and it underscores the caveat that responses are only as good as the questions asked.

Both Nienke and Rochelle found the evaluation data generated by the TCEP more informative than reports from the current system. Reports included student demographic data and average ratings from other instructors who took part in the proof-of-concept. As a result, they could compare their performance to the university average, and there was more context for interpreting the data. Overall, the experience was satisfying, and both instructors are disappointed that they have to return to the old system for the next few semesters.

Steps towards implementation:

The Senate Committee for University Teaching & Learning (SCUTL) has prepared a comprehensive report, including results from the proof-of-concept study, that will be submitted to Senate for review in early 2014. If the recommendations in the report are approved, an implementation committee will be established to carry them out.

Specifically, there will be a:

1. Request for Proposal (RFP) for an online system

2. Process through the VPA office to confirm approach to policies, teaching priorities, and institution-wide questions

The hope is to be in a position to select the first few faculties and begin implementation in the fall of 2014.

General Conclusions:

Nienke van Houten