Student retention (preventing students from quitting or dropping out) is a major concern for online learning. So why do students drop out in the first place? One reason is a lack of engagement. Indeed, some of our friends over at edX previously found that students tend to stop watching lecture videos that go beyond 4 or 5 minutes.
Our team has already done an incredible job of making online lessons more engaging. In a previous post, we showed how our highly interactive lessons increased student attention span to more than 20 minutes. So the engagement problem has largely been solved.
But while we've reduced dropout rates, a fair number of students still drop out of our interactive lessons. So we asked ourselves: what's going on here, and what can we do about it?
Now imagine you're working your way through a step-by-step, interactive lesson. You know what you're doing, getting every question right, but suddenly you hit a snag. You just can't seem to get this next question. You could ask for a hint, or you could try a few different approaches, but sometimes you'll just quit out of frustration.
With an in-person tutor, this wouldn't happen. A live tutor can diagnose exactly where your confusion is and help you get "unstuck." So our hypothesis is this: students drop out when they're told their answer is wrong without immediate, useful feedback.
If this is true, then we should see a direct correlation between how often a student is given specific feedback (either they got the correct answer, or their wrong answer is addressed by a separate branch in the lesson) and how often the student sticks with the lesson instead of dropping out. So we combed through the data from millions of students' answers to the thousands of questions in our hundreds of lessons (so yes, that's a lot of data), and here's what we found:
As we expected, there is a very strong correlation between how often students are given specific feedback regarding their answer and how often they'll stick around for the next part of the lesson. (For you stat-heads out there, the Pearson correlation for this graph was r = 0.572.)
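For the curious, here's a minimal sketch of how this kind of correlation could be computed in Python. The per-question rates below are hypothetical, purely for illustration:

```python
# A minimal sketch of the correlation analysis, using hypothetical data.
# For each question, we pair the fraction of students who received specific
# feedback (correct answer, or a wrong answer addressed by its own branch)
# with the fraction who stuck around for the next part of the lesson.
import numpy as np

feedback_rate  = np.array([0.62, 0.75, 0.81, 0.88, 0.93, 0.97])  # hypothetical
retention_rate = np.array([0.90, 0.92, 0.94, 0.96, 0.97, 0.99])  # hypothetical

r = np.corrcoef(feedback_rate, retention_rate)[0, 1]
print(f"Pearson r = {r:.3f}")
```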
This suggests an obvious way to improve retention: give students more (and better) feedback! Now that's always been our goal here at School Yourself. We regularly update our lessons so that common wrong answers are met with specific, helpful feedback. For example, in the latest version of our lesson on point-slope form, when you enter a slope with an extra minus sign or you accidentally flip the fraction (common mistakes we saw students making), you now proceed down a new path in the lesson that specifically addresses your error. Here's how the lesson map has evolved (note the change from a linear sequence to a more branched structure):
Now if we were to keep doing this, adding new branches to lessons so that every wrong answer a student gives receives specific feedback, it could take us forever (and we don't have that kind of time!). So the strategy we've adopted is to address the common wrong answers first, prioritizing questions where students frequently make the same mistakes.
So how much work would it be to provide specific feedback to 90% of students for each question? (According to the first graph, we could then expect at least 97% retention for each question, which would be outstanding.) The answer turns out to depend on the question type. Here are the results for checkbox questions and for numerical-response questions (in which a student can enter any fraction or decimal):
It turns out that for most questions, students give only a handful of answers. For both of these question types, on average, more than 90% of students give either the correct answer or one of the two most common wrong answers. So if our team addresses the top two wrong answers, we should achieve 97% retention for each question. It takes our team about a day to make a lesson, and it would take only another day to update the lesson so it offers specific feedback to more than 90% of students.
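To make the coverage calculation concrete, here's a toy sketch for a single question. The answer log is made up, and the 90% figure is just what this toy data happens to produce:

```python
# Toy sketch: what fraction of students would get specific feedback if we
# addressed the correct answer plus the two most common wrong answers?
from collections import Counter

# Hypothetical answer log for one numerical-response question.
answers = ["3/4", "3/4", "-3/4", "3/4", "4/3", "-3/4", "3/4", "0.75", "4/3", "3/4"]
correct = "3/4"

wrong_counts = Counter(a for a in answers if a != correct)
top_two_wrong = {ans for ans, _ in wrong_counts.most_common(2)}

covered = sum(1 for a in answers if a == correct or a in top_two_wrong)
print(f"Students covered by specific feedback: {covered / len(answers):.0%}")  # 90%
```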
So we'll continue to update our lessons, making them increasingly adaptive over time. And we won't have to pull any all-nighters to get it done!
How often do students guess their way through?
When designing online courses, like AlgebraX and GeometryX, a common cause for concern is that students can just "guess their way through" lessons and assessments. They're either totally lost, or not paying much attention, but they can still advance and get credit by brute-forcing their way.
This can be very problematic for the student (who is grinding away without really learning) and the course (which loses credibility when this type of student behavior is common). So here at School Yourself, we've been asking ourselves: 1) How often does this happen in our courses, and 2) What can we do about it?
To figure out how often students guess their way through lessons, our team developed a formula we call a student's guess index. It's a number between 0 and 1, where a greater number means the student was probably guessing. A student's guess index takes into account the question type (guessing on a multiple choice question vs. an algebraic question is handled differently), the number of attempts, what students do between attempts, and a few other factors. Over time (and with lots of student data) we have found that a guess index of 0.4 or greater means a student probably guessed their way through a lesson. An index of 0.2 or 0.3 means they might have struggled through parts of the lesson, but they worked throughout without resorting to guessing. And an index of 0 or 0.1 means they confidently proceeded through the lesson.
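We won't publish the exact formula here, but to give a flavor of the idea, here's a rough, illustrative sketch. The specific signals, weights, and thresholds below are invented for this example, not the ones we actually use:

```python
# A rough, illustrative sketch of a guess index -- NOT our actual formula.
# The real index also weighs question type and other behavioral signals;
# the weights and thresholds below are made up.

def guess_index(attempts: int, choices: int, seconds_between: float) -> float:
    """Return a score in [0, 1]; higher means the student likely guessed."""
    # Many attempts relative to the number of available choices suggests
    # guessing; very short pauses between attempts suggest it even more.
    attempt_signal = min(attempts / max(choices, 2), 1.0)
    speed_signal = 1.0 if seconds_between < 3 else 0.3 if seconds_between < 15 else 0.0
    return round(0.6 * attempt_signal + 0.4 * speed_signal, 1)

print(guess_index(attempts=4, choices=4, seconds_between=2))   # 1.0 -> guessing
print(guess_index(attempts=2, choices=4, seconds_between=30))  # 0.3 -> working
```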
So here's the distribution for the most recent 10,000 completed lessons:
Students appear to guess their way through lessons about 2% of the time. And we can do the same analysis for reviews (assessments):
Students seem to guess their way through reviews less than 1% of the time. It's a little comforting to know that students guess less on reviews than they do in lessons, since the reviews are what hold them accountable for their knowledge.
One or two guessing students out of a hundred may not sound too shabby. Indeed, these numbers are likely so low because 1) we prefer asking open-ended questions (with interactive graphs, algebraic expressions, numerical answers, or short answers) where possible, to ensure students understand what they're doing, and 2) our lessons are built around continuous, step-by-step user interaction, where the questions never get too far ahead of the students. But even though relatively few students are guessing, these are also the very students who need the most help! So now that we can precisely measure guessing behavior, what can we do to mitigate or prevent it?
First, it's helpful to know whether students are guessing at similar rates across all lessons and reviews (indicating a problem with the overall course design, question types, etc.), or if the guessing behavior is concentrated among certain lessons.
Half of the lessons exhibited zero guessing behavior, and guessing is concentrated among a handful of lessons, like Ambiguous case, Multiples, and Unsolvable equations. This is great news, because we can focus our attention on these lessons, see where students are guessing the most, and update them (adding new branches that provide feedback for common wrong answers).
When it comes to reviews, students don't guess as much:
The majority (69%) of reviews had zero guessing. And once again, it's a handful of reviews that are the most conducive to guessing. So that's where we'll focus our efforts.
So while students guessing their way through lessons doesn't seem to be a major problem for our content, it's something we can mitigate by focusing our efforts on the most guess-prone lessons and reviews.
Students prefer School Yourself over Khan Academy and other MOOCs, suggest improvements
A few weeks ago we launched surveys in AlgebraX and GeometryX, asking students what they thought of the courses, how they compared to others, and what improvements they might suggest. So far, more than 400 students have responded.
One sign that we're on the right track came from a series of questions on how our courses compare with other massive open online courses (MOOCs), and with Khan Academy, a popular collection of online videos and assessments.
First up, we asked "How do AlgebraX and GeometryX compare with other massive open online courses (MOOCs) you have taken?" Here are the combined responses for the two courses:
A total of 135 students said they had never taken other MOOCs. Among those who had, 106 (39%) said they had little preference between School Yourself and other MOOCs. Among those with a preference, 97% preferred School Yourself to other MOOCs. Holy moly! Now might be a good time for other MOOCs to sit up and take notice: students prefer engaging, interactive, and adaptive content. We'll continue to push the envelope, and hopefully online learning will continue to evolve and improve in the coming years.
Here at School Yourself, we're big fans of Sal Khan and all that his team is doing for millions of learners around the world. But Khan Academy continues to rely on 10-minute videos of Sal speaking and writing, without letting students interact and make sure they're following along. Khan Academy is a great resource, but we think it can be even better if it adopts our methods of content production and student engagement.
So next we asked students "How do AlgebraX and GeometryX compare with Khan Academy?" Again, here are the combined answers for the two courses:
A total of 213 students said they had never tried Khan Academy. But those who had tried it appeared to think more highly of Khan Academy than of other MOOCs. Approximately 33% of students had little preference between School Yourself and Khan Academy. But among students with a preference, 80% preferred School Yourself to Khan Academy. As with MOOCs, we hope Khan Academy evolves into a more interactive, adaptive experience. With Khan's expansive library and School Yourself's adaptive, engaging style, who knows what's possible?
Here are a few more stats from the survey:
- 90% of students would recommend AlgebraX and GeometryX to others.
- Currently, course grades are based entirely on adaptive quizzes after each topic, with ~100 quizzes in each course. 52% of students said having exams (or a final) would help them learn, 14% said they didn't want exams, and the remaining 34% were neutral. In response to this, our team is looking into adding optional adaptive exams to both AlgebraX and GeometryX. Stay tuned!
- We asked how the courses could be improved, and here were the most common responses:
- Students want more lessons on real-world applications of the material. We have a few of these, like using parallel lines and alternate interior angles to estimate the size of the Earth, and how graphs of 2D inequalities are used in machine learning. But including more examples of applications is something we're always thinking about, and we hope to add many more in the near future.
- Students want more review questions. We're always adding more questions, so we'll see how students feel again in a few months. In particular, some students who felt the courses were too easy also asked for more challenge questions (77% thought the difficulty was about right, 20% thought the courses were too easy, and only 3% thought they were too hard). At the moment, the challenge questions are entirely optional for students. If and when edX decides to include grades on certificates, we may look into awarding additional credit to students who master the challenge questions.
Finally, we asked students what other School Yourself courses they'd like to see on edX. Here's what people said:
It looks like our team should get cracking on Algebra II, which won the vote at 82%. Of course, we hope to eventually create courses covering all of these subjects! They all build on one another -- for example, you use trigonometry in physics, and calculus in statistics. The more subjects we add to the School Yourself library, the more paths students can take through the lessons, and the more adaptive and powerful the experience becomes.
Will this be on the test?
If you've ever taught a math class, then you've probably encountered the following scenario: you're showing the class a cool proof (like why an inscribed angle always has half the measure of its arc), and you get asked: "Will this be on the test?" You're demonstrating how math is a consistent framework and that even the hardest facts are derived from simpler theorems. But your students just aren't interested. And from their perspective, this makes a lot of sense. Students have many demands on their time (some related to education, some not), and they're really just trying to optimize their schedules.
And so with our lessons, we tried something a little different. We know students aren't interested in working through proofs all the time. But sometimes they're genuinely curious, or they already passed the quiz and are now returning to see why the rule they memorized actually works. So we try to leave it up to the student to choose: if you want to work through a proof, click here; to just do practice problems, click here instead.
Here's an example from our lesson on inscribed angles:
After a student sees the rule in action and plays with an interactive, he or she can decide whether to work through a proof or skip ahead. If they opt for the proof, they'll work with general angles and prove it for themselves.
Otherwise, practice problems it is!
While it's nice that we give students the choice, we're really hoping they'll elect to work through the proof. They'll get more experience with geometric proofs, get more practice with algebra, and convince themselves that the rule truly works. So how many students decide to work through the proof?
Using our analytics dashboard (which comes with our authoring tool, QuickBranch, for you content authors out there), we found that of the 1701 students who made it to this point in the lesson, 987 (or 58% of them) decided to work through the proof. And of those, 779 (~78%) made it all the way through the proof. We found similar rates in other lessons as well, like the Pythagorean theorem. More often than not, in our self-paced learning environment, students will choose to work through the proof.
When they're asking "Will this be on the test?," it's not that students don't care or don't want to learn. It's that their time is limited, and they're trying to optimize. If you free them from these constraints, and let them come back to the proof when they feel motivated or incentivized to really learn it, they'll choose to do so.
So when it comes to material that won't be on the test, trust your students and give them the choice. They may surprise you.
Can you learn Algebra in 24 hours?
How long does it take a typical student to learn algebra? This is a tough question, for a variety of reasons. Who is a "typical" student? How do you know if they've really learned the material? And what's the best way to measure the total time a student spends learning?
To get a rough estimate, consider a typical high school or college course: a student will spend about 6 hours per week in a yearlong course (or about 12 hours per week over a semester), which adds up to about 200 hours, a figure that includes both time in the classroom and homework.
With almost 40,000 students now enrolled in AlgebraX and GeometryX, our two courses on the edX platform, we were curious to see how much time students were spending on each course, and how this compares to the 200-hour figure. Now because the lessons and assessments of our courses are adaptive, different students spend different amounts of time. So to study a "typical" student, we added up all the median times that students spend on each lesson and assessment. To estimate an upper bound of how long students might spend, we also added up the 90th percentile times of the lessons and assessments, representing the slowest 10% of students.
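For those curious how that aggregation looks in code, here's a minimal sketch using hypothetical per-lesson timing data:

```python
# Sketch of the aggregation: per-lesson completion times (in minutes),
# summed across lessons at the median and at the 90th percentile.
# The timing data below is hypothetical.
import numpy as np

# lesson name -> completion times (minutes) across students
lesson_times = {
    "Solving equations": [12, 15, 9, 22, 40, 14],
    "Point-slope form":  [18, 25, 11, 30, 55, 20],
}

median_total = sum(np.median(t) for t in lesson_times.values())
slow_total = sum(np.percentile(t, 90) for t in lesson_times.values())

print(f"'Typical' student total: {median_total:.0f} minutes")
print(f"Slowest 10% total:       {slow_total:.0f} minutes")
```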
We were definitely surprised by the results. As you can see in the graph below, the median total time to complete AlgebraX is less than 24 hours! And the same goes for GeometryX! The slowest 10th percentile of students take about twice as long, finishing AlgebraX in about 48 hours, and GeometryX in about 36 hours.
Total time spent on lessons and assessments in AlgebraX and GeometryX. For each course, the lower curve shows median time, while the upper curve shows the slowest 10th percentile of students.
Next, we compared the median times for students with different backgrounds: 1) current high school students, 2) adult learners with no college degree, 3) learners with an Associate's or Bachelor's degree, and 4) learners with a graduate degree. As you can see in the graph below, the surprise this time was that all four groups typically finished the coursework in about the same amount of time.
So what does this all mean? First off, this does not mean that our students have been learning algebra or geometry from start to finish in a single day. We're talking about the total time spent within the courses, which most students complete over multiple weeks. Also, while we stratified students by educational background, edX students are probably not your "typical" students. And while students demonstrate mastery of the subject material by completing formative and summative assessments, this does not represent a rigorously designed study of precisely what a student has learned.
But what these results do suggest is that with adaptive lessons and assessments, it appears possible for the majority of online students to work through and demonstrate competence for a year's worth of course content in a fraction of the time. It seems quite plausible that a "typical" student can learn algebra or geometry in less than 24 hours.
School Yourself featured on Innovation Showcase
Last Friday, School Yourself CEO Zach Wissner-Gross sat down with Jay Sugarman, the host of Innovation Showcase on NewTV (based in Newton, MA). During the half-hour program, they discussed the origins of School Yourself, how the team personalizes learning at scale, and what's on the horizon.
In other news, we're proud to announce that the students in AlgebraX and GeometryX, our two MOOCs on edX, have now collectively solved more than 5 MILLION review questions. We're very proud of this milestone. And more students are signing up and working through these courses every day!
Finding exactly where algebra gets hard
Now that thousands of students have gone through the 88 interactive lessons of AlgebraX (not to mention the 92 lessons of GeometryX), we're starting to see trends emerge. We previously found how engaging our interactive lessons are (compared to plain video lessons), and in this post we'll dive into the content itself.
One thing we hear over and over again is how hard algebra is. It's true -- learning algebra can be quite challenging. But it's also important: every subsequent math subject (geometry, trigonometry, calculus, statistics, and so on) builds on the foundations of the subjects before it, starting with algebra.
With AlgebraX, we're trying to make it easier (and more fun!) to pick up algebra. So are we accomplishing this? Where should we be improving our lessons? (That's something we ask ourselves all the time, because our authoring platform is uniquely suited to iteration and improvement.)
One way to measure the difficulty of different topics is to look at how many students completed review questions (the assessments in AlgebraX) vs. how many students attempted those questions. If a topic is hard to learn (and/or our lesson on that topic could use improvement), we'd expect to see fewer students complete the corresponding review. Now this differs from how advanced a topic might be. For example, factoring quadratic polynomials is pretty advanced stuff for Algebra I, but students zipped right through those review questions in the course.
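The metric itself is simple; here's a sketch with hypothetical counts, chosen so the rates match two of the figures reported below:

```python
# Sketch of the completion-rate metric, with hypothetical counts chosen so
# the rates match the figures reported below.
# completion rate = students who finished a review / students who started it
review_stats = {
    "Two equations, two unknowns": (1566, 2000),  # (completed, started)
    "Combining like terms":        (1996, 2000),
}

for topic, (completed, started) in review_stats.items():
    print(f"{topic}: {completed / started:.1%} completion")
```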
Here's a bar graph showing the completion of all the reviews in AlgebraX. Each topic has its own bar, and the vertical axis shows the percentage of students who started a review and went on to finish it (meaning they mastered the topic).
The first thing to notice is that it's mostly green. In fact, the average completion rate for AlgebraX reviews is 96.1%. Once a student starts a review, it's highly likely that he/she will finish it. So what about those red and yellow areas? Well, here's a list of the hardest topics in algebra -- a "hit-list," if you will, where we'll focus on improving our lessons:
- Two equations, two unknowns (78.3% completion)
- Fractional exponents (82.0%)
- Two equations, with no solution (84.1%)
- Solving for intercepts (85.0%)
- Point-slope form (86.7%)
- Discriminants and roots (87.5%)
- The quadratic formula (87.9%)
- Solving multi-step equations (88.1%)
- Multi-variable equations (90.5%)
- Distributing roots (91.0%)
- Perpendicular slopes (91.3%)
- Calculating averages (91.4%)
And what's the "easiest" topic? You might think it's something early on in the course, but then you'd be wrong. It's simplifying expressions by combining like terms, a topic that 99.8% of students have mastered.
We're already hard at work improving the more challenging topics in the course, both by improving the lessons themselves and by adding new hints to review questions. Our goal is to turn this entire chart green. Yes, algebra will still have advanced topics, but nothing will be too hard for students to conquer!
MOOCs are getting personal, but also more engaging!
All the lessons of AlgebraX (one of our math MOOCs on edX) have now been posted. And the feedback from students has been incredible! We still have two more weeks of lessons to post for GeometryX, on surface area and transformations, and then that MOOC will be complete as well.
As edX CEO Anant Agarwal has said, our courses represent the first two adaptive MOOCs on the edX platform, where students can "choose their own adventure" through each lesson, enjoying an experience that is tailored to their individual knowledge and abilities. And we're using all the data we're collecting to improve these lessons and add more paths for different learners to follow.
Despite our MOOCs being only a few months old, we can already measure how they're being received. One important metric is engagement, or how long students' attention is focused. One-on-one learning certainly is more engaging than a lecture, but does this trend translate over to MOOCs? How do interactive, personalized lessons compare to passive video content when it comes to online learning?
Researchers have studied how long students will watch videos on edX before "dropping out," which in this case means closing the video.
Measuring engagement for passive videos on edX
This data was collected from four of edX's most popular MOOCs. One way to interpret this graph is to look at where the trend line crosses 50% (at about 4 minutes). So after about 4 minutes of a video, half the students have dropped out. With passive video content, MOOC students have an attention span of 4 minutes.
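If you want to estimate this kind of "attention span" from raw watch times, here's a minimal sketch (the durations below are hypothetical):

```python
# Sketch: estimating "attention span" as the time at which half the
# students have dropped out. Watch times (minutes) are hypothetical.
import numpy as np

watch_times = np.array([1.5, 2.0, 3.5, 4.0, 4.5, 6.0, 8.0, 12.0])

# The fraction of students still watching at time t crosses 50% at the
# median watch time, so the attention span is simply the median.
attention_span = np.median(watch_times)
print(f"Attention span: {attention_span} minutes")  # 4.25 for this toy data
```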
So how do interactive, personalized lessons compare? Well, here's the corresponding graph containing all the lessons from our AlgebraX and GeometryX MOOCs:
Measuring engagement for School Yourself's AlgebraX and GeometryX MOOCs on edX
You're reading that correctly -- the trend line doesn't cross 50% until about 22 minutes. So with our interactive, personalized lessons, attention span is 22 minutes, a 450% increase over passive video (22 minutes is 5.5 times the 4-minute attention span for videos).
Now this isn't a perfect comparison. The user interface for our lessons is a little different from the traditional YouTube scrub bar that appears in most edX videos. Also, because of the adaptive nature of our lessons, different students will spend different amounts of time on any given lesson. So to generate the above graph, we used the average lesson duration among students who completed the lesson. But despite these difficulties in comparing interactive versus passive video content, the difference is striking.
It's our hope that online learning (and MOOCs in particular) continue to become more personalized and interactive over the coming years. It seems that with this evolution, increased engagement will be an added bonus!