Regular readers of this blog will know that I am quite sceptical of the efficacy of online teaching and learning, and I have previously written about the evidence coming out of studies during the pandemic (see here, and for more see the list of links at the bottom of this post). My take on the evidence so far is that we need to carefully evaluate the changes that the pandemic has forced on teaching, and in particular the move to online delivery. So, I was interested to read this recent article by George Orlov (Cornell University) and co-authors, published in the journal Economics Letters (ungated earlier version here).
Orlov et al. evaluate the impact of the pandemic-enforced shift to online teaching on learning in seven intermediate-level economics courses at "four R1 PhD-granting institutions" (translation: top US universities). Their study has the advantage of being based on low-stakes post-test evaluations that use the same questions, implemented in the Spring or Fall semester of 2019 (i.e. before the pandemic, when teaching was in person), and again in the Spring semester of 2020 (i.e. when part of the semester was affected by the pandemic and learning had shifted online). In relation to the teaching, they note that:
Six of the seven classes were taught synchronously during the remote instruction period with lectures delivered using Zoom. The seventh instructor pre-recorded lectures and spent the scheduled class time in Zoom answering student questions about the material.
So, that pretty much reflects practice across universities during this time. Because the pandemic affected only part of the Spring 2020 semester, Orlov et al. are able to isolate its effect on the topics that were taught remotely, as well as look at the effect overall (covering both the topics taught in person earlier in the semester and those taught remotely). Based on their sample of 809 students (476 pre-pandemic, and 333 in the pandemic-affected semester), they find that:
...in the pandemic semester, the overall score drops by 0.185SD (p = 0.015) while the remote subscore drops by 0.096SD (p = 0.181). A possible explanation for the discrepancy is that these scores measure learning of topics taught closer to the administration of assessments, which potentially would be fresher in students’ memory. Furthermore, at the institutions in this study, there was an extended break (up to three weeks) before the remote portion of the semester started. Overall, these results suggest that student outcomes did suffer in the pandemic semester and the magnitudes of the declines in learning were not trivial.
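For readers who aren't used to results quoted in standard-deviation (SD) units: a gap like the 0.185SD figure above is just the difference in mean scores between the two cohorts, scaled by a standard deviation of the scores. Here is a minimal sketch in Python, using made-up scores and a pooled-SD scaling, which is one common convention and not necessarily the authors' exact approach:

```python
# Minimal illustration of a score gap in standard-deviation units.
# The scores below are simulated, not the study's data.
import numpy as np

def standardized_gap(scores_before, scores_after):
    """Difference in mean scores, scaled by the pooled standard deviation."""
    before = np.asarray(scores_before, dtype=float)
    after = np.asarray(scores_after, dtype=float)
    pooled_sd = np.sqrt(
        ((before.size - 1) * before.var(ddof=1) + (after.size - 1) * after.var(ddof=1))
        / (before.size + after.size - 2)
    )
    return (after.mean() - before.mean()) / pooled_sd

rng = np.random.default_rng(0)
pre_pandemic = rng.normal(70, 10, size=476)   # 476 pre-pandemic students
pandemic = rng.normal(68, 10, size=333)       # 333 pandemic-semester students
print(round(standardized_gap(pre_pandemic, pandemic), 3))  # negative = pandemic cohort scored lower
```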
So far, so not good for student learning. Looking across subgroups (gender, race, first-in-family students, and students without English as a first language), Orlov et al. find no statistically significant differences in the effect. Then, looking at teaching factors, they find:
...evidence that instructor experience and course pedagogy played important roles in ameliorating the potentially negative effects of the pandemic on learning. When the instructor had prior online teaching experience, student scores were significantly higher overall (0.611SD, p = 0.074) and for the remote material (0.625SD, p = 0.000). Students in classes with planned student peer interactions earned scores that were similar relative to students in other classes on the overall scores and 0.315SD higher (p = 0.040) for the material taught remotely.
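As an aside on how numbers like these are typically produced: they come out of regressions of standardized test scores on a pandemic-semester indicator, interacted with course characteristics such as instructor experience and pedagogy. The sketch below (in Python, with simulated data and illustrative variable names, and definitely not the authors' actual specification) shows the general idea: the coefficient on each interaction term measures how much the pandemic-semester decline is offset in courses with that characteristic.

```python
# Illustrative regression with interaction terms, using simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
df = pd.DataFrame({
    "pandemic": rng.integers(0, 2, n),       # 1 = pandemic-affected semester
    "online_exp": rng.integers(0, 2, n),     # 1 = instructor had prior online teaching experience
    "peer_interact": rng.integers(0, 2, n),  # 1 = course used planned peer interactions
})
# Simulated standardized scores: a pandemic penalty that is partly offset when
# the instructor has online experience or the course uses peer interactions.
df["score_std"] = (
    -0.2 * df["pandemic"]
    + 0.3 * df["pandemic"] * df["online_exp"]
    + 0.2 * df["pandemic"] * df["peer_interact"]
    + rng.normal(0, 1, n)
)

model = smf.ols(
    "score_std ~ pandemic + pandemic:online_exp + pandemic:peer_interact",
    data=df,
).fit()
print(model.params.round(3))  # interaction coefficients capture the offsetting effects
```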
Those results suggest that, as teachers gain more experience with online teaching, the negative impacts may diminish. Even more importantly, they point to a method of teaching that may help. Orlov et al. defined peer interactions as:
...the use of at least two of the following strategies: 1) classroom think-pair-share activities... 2) classroom small group activities, 3) encouraging students to work together outside class in pre-assigned small groups, and 4) allowing students to work together on exams.
These sorts of interactive learning approaches are somewhat more difficult to execute in a synchronous online environment (and are unlikely to be achievable in an asynchronous format), but it appears that they can be done well, and that they contribute to student learning (or, at least, reduce the negative impact of online learning). However, we shouldn't read too much into this study. Although Orlov et al. did conduct sub-group analyses, their sample is drawn from top US universities, and probably doesn't extrapolate well to lower-ranked institutions, where students may be less motivated or engaged. Moreover, the teaching methods were not randomly assigned, so we can't interpret the differences as causal. Clearly, we still need more work in this area.
Read more:
- Online vs. blended vs. traditional classes
- Flipped classrooms work well for top students in economics
- Flipped classrooms are still best only for the top students
- Online classes lower student grades and completion
- Online classes may also make better students worse off
- Meta-analytic results support some positive effects of using videos on student learning
- New review evidence on recorded lectures in mathematics and student achievement
- Meta-analytic results may provide some support for flipping the classroom
- The value of an in-person university education
- The pandemic may have revealed all we need to know about online learning
- Live streamed video lectures and student achievement
- Low-performing students, online teaching, and self-selection