Faculty Working Groups

Faculty working groups play an important role in helping us evaluate learning technologies to fill gaps as they arise. Instructors in working groups collaboratively develop core criteria to compare the available tools and provide recommendations on which technologies should receive central funding and support.

If you are interested in being involved in any current or future working group initiatives, please contact us at the LT Hub. We welcome new faculty and student input in these groups.

Current Groups

The Enterprise Video Platform (EVP) Working Group is composed of faculty, students, and instructional support staff. The group's purpose is to collect the broader needs of the UBC academic community for a video platform. The group plans to meet 3-4 times during the summer and fall of 2023. Through these meetings, the group will develop comprehensive surveys to capture a range of perspectives and insights from faculty and students.

In the fall, the group will also analyze the data collected from the surveys to inform requirements for the EVP request for proposal. These requirements will help guide the selection of the video platform that best meets the needs of the university.

To join the working group, you can contact Lucas Wright (Learning Strategist for EVP).


Past Groups & Outcomes

Plagiarism Tools Working Group

The Plagiarism Tools Working Group's purpose was to evaluate the range of tools available for checking written work for plagiarism. Contract expiration and ongoing privacy challenges with UBC’s centrally supported Turnitin plagiarism tool had surfaced a need to investigate possible alternatives. The project determined which pedagogical and functional practices were useful to support, and members recommended which tool(s) would be preferred.

After several discussions and demos, the working group of faculty, students, and staff unanimously agreed that Turnitin still offered the best tool in this space.

  • Turnitin had improved its privacy compliance to better abide by FIPPA, with the potential to support better integration with Canvas.
  • Compared to other similar tools, Turnitin had the most flexible configurations for assignments.
  • Turnitin's features included the ability to score assignments directly in the application and re-use the written feedback for students.
  • Turnitin provided comprehensive options for instructor and teaching assistant settings as well as the strongest support for multilingual plagiarism-checking.
  • As a tool already in use at UBC, Turnitin offered a familiar interface for instructors and students.

Implementation and funding decisions

UBC's learning technology leadership decided to continue funding and supporting Turnitin as its primary plagiarism-prevention tool. The contract extends our current license, with the option to renew for additional years.

Student Response System Working Group

The Student Response System (SRS) Working Group’s purpose was to evaluate SRSs broadly and recommend which SRS(s) to support within UBC’s Learning Technology Environment. SRSs allow instructors to collect responses to in-class questions that students answer individually using their computer or mobile device.

As part of this process, the group compiled market scans, usage statistics for UBC iClicker and Top Hat, and evaluation processes used at other institutions; conducted an SRS literature review; developed and conducted an SRS instructor survey; and demonstrated select SRSs for each other. This combined information informed the development of a core set of criteria for evaluating systems against one another.

Each member then ranked a shortlist of the systems the working group determined most viable for campus-wide use. The outcome identified three systems that most members thought suitable for central support:

  1. Top Hat, with a focus on supporting more advanced use cases. It seemed the most robust and flexible solution that fulfilled the highest number of criteria and offered the widest variety of question types.
  2. Poll Everywhere, with a focus on supporting simple use cases. Many members thought it felt like the easiest system to quickly learn and use, particularly for those new to SRS, and therefore could promote adoption of SRS generally. In addition, workflow and reporting features may allow for more in-depth usage.
  3. iClicker Cloud, with a focus on supporting instructors already familiar with the iClicker Classic workflow. Discontinuing use of this system when instructors are already challenged with learning many new technologies seemed unfair, especially since it met many of the core criteria well.

Overall, the working group wanted to see UBC support at least two of these systems. Providing instructors with options to choose from ensures that all pedagogical use cases will be supported and that any instructor interested in using an SRS can find one that fits.

You can read the full UBC SRS Working Group report for more information, including details of the evaluation process, instructor survey results, and group membership.

Implementation and funding decisions

UBC's learning technology leadership decided to pursue a contract with iClicker Cloud. The initial contract was a trial covering the academic year, with a decision on whether to renew to follow.

A contract with Top Hat was ruled out at the time, as leadership did not believe they could reach a satisfactory legal agreement. Similarly, investigation into Poll Everywhere highlighted the significant work required to make it ready for privacy-compliant use.

Student Peer Assessment Tools Working Group

The Student Peer Assessment (SPA) Tools Working Group’s purpose was to evaluate SPA tools broadly and recommend which to support at UBC, both for tools that facilitate review of the peer collaboration process (e.g., students review other students’ contribution to a group) and those that facilitate review of peer products (e.g., students review other students’ work).

The group compiled market scans, UBC tool usage statistics, previous pilot and usability data, and evaluation processes used at other institutions; developed and ran an SPA tools instructor survey; and discussed or evaluated a variety of potentially suitable SPA tools. This combined information informed the development of two sets of core criteria for evaluating tools against one another.

To expedite a recommendation ahead of online courses in fall 2020 and support the pivot to online teaching, the working group focused on a shortlist of tools that best fit the criteria extracted from the survey results and the group’s own experiences. Members explored and then rated these tools against the core criteria and collaborated on lists of pros and cons. The outcomes and further discussion resulted in a recommendation of five tools:

  • iPeer for reviewing the peer collaboration process, though this recommendation was made with the assumption that further development by UBC would be supported to enhance the tool’s ease of use and bring it into alignment with the core pedagogical needs it did not meet.
  • peerScholar for reviewing peer products using traditional peer review that can be customized to suit most contexts.
  • CLAS, ComPAIR, and PeerWise for reviewing peer products for special situations (video annotation, comparative decision-making, or multiple-choice item writing), which may also help facilitate access to instructors who are new to student peer assessment.

You can read the full UBC SPA Tools Working Group report for more information, including details of the evaluation process, instructor survey results, and group membership.

Implementation and funding decisions

UBC's learning technology leadership decided to continue funding and supporting peerScholar. CLAS, ComPAIR, iPeer, and PeerWise will also continue to receive central support.

Additionally, leadership hoped to set aside developer time to support further development of iPeer, under the guidance of a new instructor working group.