Working Groups

In 2020, two separate instructor-led working groups formed to evaluate tools in the spaces of student response systems and student peer assessment tools. After several months of in-depth consideration, the working groups submitted recommendations for what tools UBC should support in these spaces going forward.



Student Response System Working Group

The Student Response System (SRS) Working Group’s purpose was to evaluate SRSs broadly and recommend which SRS(s) to support within UBC’s Learning Technology Environment. As part of this process, the group: compiled market scans, usage statistics for UBC iClicker and Top Hat, and evaluation processes used at other institutions; conducted an SRS literature review; developed and conducted an SRS instructor survey; and demonstrated select SRSs for one another. This combined information informed the development of a core set of criteria for evaluating systems against one another.

Each member ranked a shortlist of the systems the working group determined most viable for campus-wide use. The outcome identified three systems that most members thought suitable for central support:

  1. Top Hat, with a focus on supporting more advanced use cases. It seemed the most robust and flexible solution that fulfilled the highest number of criteria and offered the widest variety of question types.
  2. Poll Everywhere, with a focus on supporting simple use cases. Many members thought it felt like the easiest system to quickly learn and use, particularly for those new to SRS, and therefore could promote adoption of SRS generally. In addition, workflow and reporting features may allow for more in-depth usage.
  3. iClicker Cloud, with a focus on supporting instructors already familiar with the iClicker Classic workflow. Discontinuing this system at a time when instructors are already challenged with learning many new technologies seemed unfair, especially since it met many of the core criteria well.

Overall, the working group wanted to see UBC support all or at least two of these systems. Providing instructors with options to choose from ensures that all pedagogical use cases will be supported and that any instructor interested in using an SRS can find one that fits.

You can read the full SRS Working Group report for more information, including details of the evaluation process, instructor survey results, and group membership.

Leadership decisions

UBC’s learning technology leadership decided to pursue a contract with iClicker Cloud for fall 2020. The contract is a one-year trial intended to carry everyone through the academic year, to be reconsidered after the year concludes.

A contract with Top Hat was ruled out at this time, as leadership did not believe a satisfactory legal agreement could be reached in time for fall. Similarly, investigation into Poll Everywhere highlighted significant work required to make it ready for FIPPA-compliant use.


Student Peer Assessment Tools Working Group

The Student Peer Assessment (SPA) Tools Working Group’s purpose was to evaluate SPA tools broadly and recommend which to support at UBC, both to facilitate review of the peer collaboration process (e.g., students reviewing other students’ contributions to a group) and of peer products (e.g., students reviewing other students’ work).

The group compiled market scans, UBC tool usage statistics, previous pilot and usability data, and evaluation processes used at other institutions; developed and ran an SPA tools instructor survey; and discussed or evaluated a variety of potentially suitable SPA tools. This combined information informed the development of two sets of core criteria for evaluating tools against one another.

To expedite a recommendation ahead of online courses in fall 2020 and support the pivot to online teaching, the working group focused on a shortlist of tools that best fit the criteria drawn from the survey results and the group’s own experiences. Members explored and then rated these tools against the core criteria and collaborated on lists of pros and cons. The outcomes and further discussion resulted in a recommendation of five tools:

  • iPeer for reviewing the peer collaboration process, on the assumption that UBC will support further development to enhance the tool’s ease of use and bring it into alignment with the core pedagogical needs it does not currently meet
  • peerScholar for reviewing peer products using traditional peer review that can be customized to suit most contexts
  • CLAS, ComPAIR, and PeerWise for reviewing peer products for special situations (video annotation, comparative decision making, or multiple-choice item writing), which may also help facilitate access to instructors who are new to student peer assessment

You can read the full SPA Tools Working Group report for more information, including details of the evaluation process, instructor survey results, and group membership.

Leadership decisions

UBC’s learning technology leadership decided to continue funding and supporting peerScholar into the 2020 winter terms. CLAS, ComPAIR, iPeer, and PeerWise will also continue to receive central support.

Additionally, leadership hopes to set aside developer time later in the year to support further development of iPeer, under the guidance of a new instructor working group.