Forum Class Engagement Metrics

I led the definition and design of a student engagement dashboard that instructors loved and that saved hundreds of hours of work.

Duration: 2 months

Company: Minerva Project

Team: 1 engineer, 1 academic SME

Intro

Forum is a science-based online learning platform used by universities and institutions around the world to create, deliver, and assess pedagogically sound curricula.

As our partner base grew, I saw an opportunity to turn Forum’s rich classroom data into actionable insights that support our partners at scale.

The Challenge

The Customer Success team was struggling to keep up with the growing demand for partner engagement and success reports. These manually created, data-heavy documents were produced multiple times per semester for each partner. They took significant time and effort to create, yet customers didn't find them very useful.

At the same time, there was growing pressure to make better use of Forum’s classroom data. Forum already tracked participation details like:

  • Student and instructor talk time 

  • Chat and reaction activity

  • Attendance and breakout participation

The Talk Time view can be displayed to the instructor at any time during class, and is one of Forum's most loved features.

This data was either inaccessible or buried in backend databases and third-party analytics tools that most people couldn’t use. It was clear there was both a scalability issue and a strategic opportunity.

Problem Statement

How might we make engagement data more useful and accessible to partners, while still aligning with our pedagogical standards?

Discovery

I kicked off the project by meeting with customer-facing stakeholders and instructors to understand how current reports were built, what partners actually valued in them, and how we might provide insights to instructors in a more timely, easy-to-consume way.

It was clear that the effort required to produce them wasn't scalable, but the reports gave me key user goals and needs to explore and validate.

Across the board, people wanted to answer questions like:

  • How is this program or course doing overall?

  • Are any particular sections struggling?

  • Which students might be at risk—and how soon can we know?

These questions pointed to a need for insights over time at the program, course, section, and class level.

Defining the Vision

One major point of conceptual tension was how to define and interpret “engagement.” Instructors consistently wanted some kind of engagement indicator—but our academic SME (a math expert) was understandably cautious about oversimplifying what’s actually quite complex. Talk time, for example, varies by teaching style and class format. What’s “good” in one context might be problematic in another.

Sketches exploring different ways to display key engagement data.

I facilitated several iterative workshops to:

  • Identify and discuss the scope of data we had access to

  • Explore what we could confidently infer from the data and how we could responsibly surface it in a way that is pedagogically sound 

  • Clarify where we might unintentionally mislead

  • Establish a shared foundation for future improvements

We aligned on a long-term goal: to enable partners to reflect more meaningfully and to identify and support disengaged students more effectively.

Concept Testing

Once we had scoped the potential data and ideas, I created two dashboard concepts that presented the data in different ways and tested them with 14 instructors and administrators.

Two different directions in which engagement data could be visualized, using components from the reports, as well as new ideas.

Key insights included: 

  • Talk time, chat, and polls were identified as key engagement metrics, both from a user and a pedagogical perspective

  • There was some concern that this information could be used or interpreted in unintended ways, so it needed to be surfaced carefully

  • To accommodate both admin and instructor needs, the solution needed to allow for viewing at multiple levels, from a program to a specific class

Final Designs

I worked closely with stakeholders to collect feedback and refine the details as the designs moved toward their final form.

The final dashboard provides instructors with:

  • 3 key metrics: student talk time, instructor talk time, and attendance

  • Additional metrics such as polls, chat, breakouts, and reactions

  • Student talk time history for the last four classes

The final design used color to indicate engagement health in an easily glanceable way.

Impact

Instructors have already started using this data to better inform their decisions during class. Additional impact from this work includes:

  • 400+ hours saved annually by eliminating the need for most manual reports

  • Partner alignment with Minerva’s internal academic team, whose approval was critical due to their high pedagogical standards

  • Strategic enablement: With the dashboard in place, Partner Success teams were freed up for demos, support, and upsell conversations

"These are super helpful for personal reflection and faculty reviews!"

"The engagement chart is everything I've ever wanted."

"The engagement chart is everything I've ever wanted."

Feedback from instructors

Looking Ahead

We successfully launched a clearly scoped MVP, but we also had a roadmap for expanding the dashboard. Future phases would deepen insights using AI, replace the remaining manual reports, and work toward a more nuanced, data-informed definition of engagement.

© Cody Morrow 2025