Student Course Evaluation Working Group Final Report

At a special joint committee meeting of the UOITFA and UOIT, the following final report and recommendations were approved. While more work remains to be done in implementing the recommendations, this concludes the activities of the working group set out in Letter of Understanding #4 Re Student Course Evaluation Working Group in the 2015-18 Tenured and Tenure-Track Collective Agreement.

Student Course Evaluation Working Group Final Report

Committee Membership:

For the University: Robert Bailey, Greg Crawford, Jaymie Koroluk, Caitlin Crompton

For the Association: Mikael Eklund, Shanti Fernando, Kamal Smimou, Christine McLaughlin


In March of 2016, the University of Ontario Institute of Technology (UOIT) and the University of Ontario Institute of Technology Faculty Association (UOITFA) ratified a Letter of Understanding creating a Student Course Evaluations Working Group. This is the final report of that working group. Composed of three representatives each from UOIT and the UOITFA, along with one staff support person from each, the working group was tasked with reviewing the current tool and its use, recommending amendments, if any, and advising on how the instrument should be used and recorded for various purposes.

The working group was given four months to complete its work, with the option of requesting further time from Joint Committee, which was ultimately required. The group met six times between June 2016 and January 2017. During this time, we covered much ground in the dense terrain of student course evaluations, or student opinion surveys. While clear differences of opinion exist, there were also many promising areas of common ground. This report aims to highlight our commonalities, while also recognizing our differences.

All working group members demonstrated a deep commitment to better understanding the research, issues and best practices surrounding student survey tools, which in turn played a critical role in forging areas of common ground. The first meeting was almost entirely dedicated to a review and discussion of pre-existing studies on this topic. Given that this is a prolific area of study, a full appraisal of the literature proved impractical. Working group members therefore reviewed some of the newer and/or well-known scholarly articles and literature reviews on the topic. An Association position paper outlines some of the works discussed. The UOITFA also presented its 2014 report on this subject, based on the results of a survey of its members, along with policy statements from the Canadian Association of University Teachers (CAUT). Finally, and perhaps most pertinent to its work, the working group also reviewed a lengthy HEQCO report on student course feedback surveys. The committee also notes a pending report from OCUFA regarding “Student questionnaires on courses and teaching.”

Review of Current Tool

A review of UOIT’s current course survey tool, which has remained unchanged since the university was formed, was one of the first tasks of the working group. While there continue to be differences of opinion on the nature of specific questions, working group members established common ground on the overall structure of a revised survey. This includes three general sections: one measuring student readiness, one gauging the role of the university and learning environment, and one section assessing the role of the instructor. This format appears to be well in line with best practices as outlined by HEQCO.

The name of the tool initially proved contentious; the current name, “course evaluation,” arguably implies greater evaluative weight than the instrument warrants. UOITFA representatives advanced “student opinion surveys” as a more appropriate designation. The adoption of “UOIT Student Course Feedback Survey” reflects the committee’s further efforts to move towards common ground.

While agreement on the value of anonymous comments was not fully reached, some encouraging measures were realized in drafts of a revised tool, including efforts to position the comments space more positively. This includes explicit statements informing students of the repercussions of inappropriate comments and a warning that such comments could lead to the deletion of their survey and its associated scores. While such initiatives reflect a significant improvement to the current tool, it is also important to remember that even the best intentions can have unintended consequences; for example, statements such as these do not eliminate biases such as racism or sexism. As such, we must remain sensitive to the many ways in which hidden forms of discrimination can still underpin such tools.

Overall, committee members spent a great deal of time and effort discussing the current tool and possible revisions to it. The general tone of such conversations was positive and intellectually stimulating. While we did not agree on every minute detail, we did achieve some general points of consensus that, if implemented, will result in a significantly improved tool.

Use and Record

Committee members were also tasked with assessing the use of the tool, including how the instrument should be used and recorded for various purposes. One source of debate on this surrounds the value of these tools. Association representatives have advanced the view that the tool ought to be used primarily for formative purposes for faculty members. From this perspective, the tool is most useful for faculty members in self-assessing and improving their teaching strategies. Administration representatives advanced the viewpoint that these are also important tools in helping deans identify and address possible issues in the classroom. Course feedback surveys are currently used as a part of the annual review process. They are also included in the Official File, meaning they are accessible to review committee members in making career decisions, such as Tenure and Promotion Committees.

Committee members gradually moved towards consensus on the tool’s summative value. A large body of literature questions the statistical validity of such tools (see the Association position paper). For example, roughly half of faculty members must, by definition, score below the average, so a below-average result is not in itself evidence of poor teaching. Compiled statistical data therefore may not offer the best measure of teaching performance. Discussions accordingly progressed towards avoiding aggregated presentations of results.

Student comments were the subject of one full committee meeting. In the past, inappropriate comments have been flagged and removed prior to distribution to faculty members. Associated scores were not removed. The need for a clear set of criteria for the removal of comments was considered. There was also a great deal of discussion on the level to which faculty members should be made aware of and consulted on the removal of inappropriate comments. On the one hand, we do not want to expose faculty members to potentially abusive, harassing and/or discriminatory comments. On the other hand, faculty members should have the right to know about and address inappropriate comments if they so choose, particularly as it relates to contextualizing resultant low scores. Committee members considered a mechanism for faculty members to initiate the removal of inappropriate comments. It is also important to remember that biases and discrimination are not always blatant, and that these can be hidden beneath the surface of seemingly objective tools.

Committee members reached some agreement on the importance of context in assessing and understanding results. While full agreement on each element necessary for such contextualization remained elusive – for example, student perceptions of class size – there was general agreement on the importance of context. This includes factors such as response rate, student attendance and preparedness, university resources, and whether Teaching Assistants assumed any course duties. Areas of agreement on the significance of context resulted in some key recommended modifications to the current tool.

While disagreement on the value or appropriateness of the tool in career decisions remains, some progress was achieved through agreement on better guidance and training for those using the tool as a component in career decisions. The creation of a guidance document on assessing and contextualizing tool results, along with fostering a greater understanding of its limitations, is a positive step in the right direction.


Recommendations

The Student Course Evaluation Working Group presents the following recommendations for the approval of a special meeting of Joint Committee:

  1. That the Student Course Feedback Survey agreed to by the Student Course Evaluation Working Group replace the current course evaluations in use at UOIT.
  2. That greater guidance and education in teaching excellence and formative evaluation be facilitated, including teaching dossier support by the Office of the Provost.
  3. That a guidance document for Deans, Faculty Review Committees, Continuing Appointment Review Committees, and Tenure and Promotion Committee members on the use of student course feedback surveys in their deliberations be created in consultation with the UOITFA. This document should include the principles discussed by the committee.
  4. That the UOIT Course Evaluation policy be reviewed and updated in consultation with the UOITFA.
  5. That the reporting format for course feedback surveys not include aggregate scores. Aggregate scores include professor scores and simple averages across one or more questions, courses, Faculties or the University.
  6. That a clear process for the redaction or deletion of comments prior to reporting be formalized. If the process results in the deletion of comments, the entire survey will be deleted. Faculty members will be given the opportunity to view any redacted or deleted comments should they so choose.
  7. That a mechanism for faculty members to raise concerns about inappropriate comments, and to have such comments or the entire survey removed from their student course feedback surveys, be implemented in consultation with the Association.
  8. That a student’s entire survey be deleted if it contains inappropriate comments, so that both the inappropriate comments and their associated scores are removed.
  9. That student course feedback survey respondents remain identifiable to the Administration so as to facilitate follow-up on inappropriate comments, and for no other purposes.
  10. That the Provost’s Office offer improved resources and outreach opportunities to educate students about the merit and completion of the surveys.
  11. That student opinion surveys at UOIT be officially recognized as “Student Course Feedback Surveys.”
  12. That a pilot of the new tool begin in the Fall 2017 term and that the new tool be active and available no later than Winter 2018.
