It's your turn now!

At Eedi, we believe that the future of EdTech is not just in the code we write or the UX we design, but in the insights we glean from data. Today, we're excited to share findings from one of the many rapid experiments we've conducted on our platform, illuminating our path toward enhancing the learning experience.

We started out with a problem: we noticed that, in comparison to our other lesson types, students did not put in much effort after watching one of our Teach videos. We wanted to test ways to enhance engagement with the Teach video through subtle nudges. We're committed to open science, so today we're releasing the experiment dataset.

Get access here

The Experiment Unpacked

We explored the impact of a free-text answer box, which encouraged students to participate actively by typing their responses rather than simply clicking a button. We tested four variations: a baseline group and three modified approaches, each adding a different level of prompting and feedback to engage students. It's a small change, but could it have an impact?

Incorporating the principles of behavioural science into the design of our experiment, we aimed to:

  1. Encourage effortful processing: By actively engaging students in solving problems after watching the Teach videos, we move beyond passive consumption to application, reinforcing learning through action.
  2. Reduce 'gaming of the system': By lengthening the process of completing the Teach video option, we aimed to balance the attractiveness of different lesson options, encouraging genuine engagement over shortcut-taking.

Here are the four experiment variants:

Variant 1: Business-as-usual (control).

Variant 2: Introduction of an open-response box for answers, directly encouraging active engagement by typing responses.

Variant 3: A preparatory prompt before the video to prime students for active participation, combined with the open-response box.

Variant 4: Similar to Variant 3, but with added emphasis on accountability to the teacher, aiming to further increase the seriousness with which students approach the task.


What We Found

  • More than 15,000 quizzes were started during the experiment, and each variant was seen more than 7,000 times. The quizzes all focus on maths, but cover a mix of year groups (Year 6 to 11) and topics (from Ordering Integers to Laws of Indices, and everything in between). Read more about the dataset in the supplemental materials; download via this link.
  • Engagement Boost: While the addition didn't significantly change the odds of students answering the follow-up ("Check Out") question correctly, it led to a substantial increase in the time students dedicated to the "your turn" video question. In the business-as-usual condition (control), the median time spent on the "your turn" video question was just 4 seconds; this jumped to 24-30 seconds across the three treatment variants. We statistically compared mean time spent for students who spent at most 10 minutes on the question and found that all experimental variants clearly beat the control, with Variant 4 coming out on top.
  • Drop-off Rates: Despite the increased engagement, we observed higher drop-off rates in quiz completion among the treatment groups (17%, compared to 15% for control), indicating a potential increase in cognitive load or fatigue. This highlights the importance of testing product improvements rigorously: while we achieved our goal of getting students to engage actively with the quiz, we didn't improve completion rates. We write detailed test protocols, with primary and guardrail hypotheses, before running any experiment, just so we can always keep track of impact on our key metrics. (A sketch of how you might reproduce both comparisons follows this list.)
  • Encouragingly, when we calculate drop-off by quiz section, more than 98% of students who saw a worksheet variant went on to attempt the Check Out question, regardless of condition. This tells us that being exposed to an experimental variant did not necessarily reduce a student's drive to complete the quiz section.
  • Given the nuanced results, Variant 3 emerged as our choice for broad implementation. This decision was informed by its relative performance and the qualitative feedback indicating it struck a balance between added engagement and usability.
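
For readers who want to reproduce these checks against the released dataset, here is a minimal Python sketch. The file name and the column names (variant, time_on_question_s, completed_quiz) are our assumptions for illustration, not the dataset's confirmed schema; check the supplemental materials for the real field names.

```python
import pandas as pd
from scipy import stats

# Hypothetical file and column names -- see the supplemental
# materials for the released dataset's actual schema.
df = pd.read_csv("eedi_teach_video_experiment.csv")

# 1) Time spent on the "your turn" question, restricted to students
#    who spent at most 10 minutes (600 s), as in the analysis above.
capped = df[df["time_on_question_s"] <= 600]
control_times = capped.loc[capped["variant"] == 1, "time_on_question_s"]
for v in (2, 3, 4):
    treat_times = capped.loc[capped["variant"] == v, "time_on_question_s"]
    # Welch's t-test: compares means without assuming equal variances.
    t, p = stats.ttest_ind(treat_times, control_times, equal_var=False)
    print(f"variant {v}: mean {treat_times.mean():.1f}s "
          f"vs control {control_times.mean():.1f}s, p = {p:.3g}")

# 2) Drop-off: compare completion between pooled treatments and the
#    control with a chi-squared test on the 2x2 completed/dropped table.
is_treatment = df["variant"] != 1
table = pd.crosstab(is_treatment, df["completed_quiz"])
chi2, p, _, _ = stats.chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3g}")
```

Welch's t-test mirrors the mean-time comparison described above, and the chi-squared test is one standard way to compare completion proportions between conditions; you may well prefer a different test once you see the shape of the data.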

The outcomes of our experiment may not rewrite the book on educational technology, but they add a valuable paragraph to an ongoing narrative. Each data point, each analysis brings us closer to understanding how digital platforms can better serve educational needs. More than the results themselves, it's the process of iterative experimentation that underscores our approach at Eedi.

If you are keen to read our detailed analysis report, reach out to bibi.groot@eedi.com.

Dive Into the Data

Your Invitation: We're not just sharing our journey; we're inviting you to be a part of it. Explore the dataset, challenge our findings, or build upon them. Access the dataset here.
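
To get oriented, a few lines of pandas go a long way. As before, the file and column names below are placeholders rather than the dataset's confirmed schema:

```python
import pandas as pd

# Placeholder file/column names -- see the data dictionary in the
# supplemental materials for the real schema.
df = pd.read_csv("eedi_teach_video_experiment.csv")

print(df.shape)                                      # rows x columns
print(df["variant"].value_counts())                  # exposures per variant
print(df["year_group"].value_counts().sort_index())  # Year 6-11 mix
print(df.groupby("variant")["time_on_question_s"].median())  # median time per variant
```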

Challenge Yourself: How can we balance deeper engagement with the need to complete lessons? We want to hear from you if you have evidence-based ideas for follow-up experiments:

  • What modifications to the free-text answer box could reduce cognitive load without sacrificing engagement?
  • How might gamification elements be integrated to maintain or increase quiz completion rates?

Looking Ahead

We tested small framing changes in this experiment, but there’s much more to explore. From experimenting with variable rewards to fostering an environment of cooperation and competition, we're on a quest to uncover strategies that significantly enhance student engagement and learning outcomes. Are there innovative approaches or ideas you believe could make a transformative impact? We invite educators, researchers, and enthusiasts to join us in this exploration. Together, we can harness the power of technology to unlock every student's potential.

Reach Out and Collaborate

If you're inspired by our work or curious about future experiments, we'd love to hear from you. Connect with us and take part in reimagining the future of education.

Acknowledgements: Funding through the LEVI programme has been instrumental in enabling us to share our research and data with the broader community.

P.S. This experiment is one of many in Eedi's ongoing series of experiments, each designed to chip away at the challenge of making online learning as effective and engaging as possible. We will be sharing more over the coming months.