Learning Analytics

Innovation Pilot

The Learning Analytics project aims to take a data-driven approach to improving student learning. It comprises three key strands: 1) community engagement through research pilot activities, 2) development of an ethics and policy framework around the use of data to improve learning, and 3) development of the technical infrastructure to deliver actionable insights from learning data at an enterprise level. We are currently calling for topics and questions to explore.


Learning Analytics (LA), as defined for this project, is the measurement, collection, analysis, and reporting of data about learners and their contexts, with the goal of understanding and optimizing learning and the environments in which it occurs. LA supports student learning success by:

  • Driving continuous improvement in the learning environment
  • Assisting in the early identification of students at risk of failure, supporting targeted and timely interventions
  • Measuring the value of investments in teaching and learning transformations
  • Supporting the planning of programs, courses, and infrastructure
  • Enabling the evaluation of instructional materials, courses, and programs
  • Supporting efficient and timely data-informed decision making

At a high level, the project comprises three inter-related areas of activity:

  1. Community engagement via research pilots
  2. Ethical and policy considerations
  3. Technical infrastructure and solution architecture

To facilitate the second of these areas of activity, a Learning Data Committee has been established. This high-level academic committee is charged with discussing and proposing institutional principles, policy, and practice with respect to learning data, and will advise the Learning Technology Leadership Team. Project governance also includes a Steering Committee and two working groups.


These questions were drawn from a presentation; you can watch the full video for more details (the 3-D learning segment starts around the 30-minute mark).

Q: Data gathering – what sort of ethics difficulties do you encounter in collecting this student data?
A: This is a significant area of focus for the project. As an institution, we don't have well-developed principles and policies in this area, much like many other universities. We need to formulate these in order to answer questions such as: What data do we have the right to collect from students? Do they opt-in or opt-out? For what purposes are we collecting this data? How do we deal with insights that might arise from combining or aggregating data? What processes govern access to this data for teaching enhancement and/or research purposes? How long do we retain the data for? Do students have a 'right to be forgotten'? Fortunately, a number of organizations and institutions (e.g., the Open University and JISC in the UK, the University of Michigan and others) have made progress here and we can learn from their efforts.

Q: What resources will be allocated for the Learning Analytics project?
A: The project budget funds staffing to support the LA pilots, LA pipeline development, project management, communication, and data governance roles, as well as the purchase of LA infrastructure and services. Beyond these funded positions, the two working groups have also identified sets of institutional stakeholders and interested parties to act as advisors and collaborators in support of the project.

Q: Many learning analytics questions depend on capabilities of the LMS, so the sooner we know what we have (i.e., not Blackboard) the sooner we can formulate meaningful questions.
A: On June 8, we announced that Canvas has been chosen as the replacement system for Blackboard Learn (Connect). You can find more details here.

Q: Can sessional instructors apply for the "call for questions" as a project PI?
A: At present, this call for questions is open to tenured or tenure-track faculty or 12-month lecturers from both UBC campuses.

Learning Analytics Stakeholders

Learning Analytics (LA) is a broad term that spans a wide range of activities: from instructors testing the effectiveness of learning approaches, to instructors and advisors determining the efficacy of particular learning interventions, to researchers asking basic questions of learning data to gain insights into individual performance or learning strategies, to institutional approaches used for program planning or reporting.

Purposes of using LA vary greatly, and stakeholder groups are diverse in their roles and interests.

Purposes and stakeholders in university learning analytics (adapted from Kay, 2013). LA tools or insights can facilitate the following purposes, each listed with its example stakeholders:

Learner empowerment: awareness and control of one's own learning strategies and performance, encouraging self-regulated learning and supporting metacognition.

Example stakeholders: learners

Monitoring and tracking for immediate decisions:

  • Identifying problems early enough to intervene
  • Distinguishing students who are disengaged
  • Responsive interventions / enhancements

Example stakeholders: individual educators

Reflection and research for recognizing long-term issues:

  • Insights into learning processes / performance across many learners
  • Education research
  • Attrition factors
  • Insights into individual performance
  • Socio-cultural aspects and underserved populations

Example stakeholders: individual educators, educational researchers

Planning and decision-making:

  • Course design / re-design
  • Curriculum and program planning
  • Faculty-level planning with regard to course and program offerings
  • Teaching assignments / enrolment patterns
  • Decision-making with regard to management, staffing, etc.

Example stakeholders: individual educators

Reporting and communication among and between stakeholder groups, e.g.:

  • Educators to learners
  • Institution to parents / government
  • Peer to peer