When computers first made their way into the classroom in the 1980s, the few students who had access to them used them for less than 30 minutes a week. Today, in a time where we have the power of the internet in our pockets, classrooms look a little different.
Devices are part of the classroom norm, as are the digital programs, resources and learning management systems that work alongside them. Technology-supported teaching and learning has grown significantly over recent decades (particularly amid the coronavirus pandemic). In the Teaching and Learning International Survey (TALIS), Australian schools were the third-biggest users of ICT among the participating OECD countries. And global investment in the EdTech market continues to rise, predicted to grow from $7bn in 2019 to $341bn by 2025.
But as digital programs in particular become increasingly available, it is vital for educators to consider the value these resources actually provide. While they have been shown to improve student engagement, we must ask whether a resource uses technology merely to make simple improvements, or whether it goes one step further, using technology to enhance pedagogy and support efficacy.
Walk into the 2021 classroom and you’ll see students completing gamified maths activities, or engaged in interactive simulations, or even using online editing programs to create a video for an upcoming project.
Technology is so integrated into classrooms that it has become a key component of how we teach and learn. In fact, more than three quarters of Australian teachers report that they frequently or always let students use ICT for projects or class work, compared to 43% in the top-performing jurisdiction, Singapore, and 53% across the OECD.
As our use of technology in the classroom grows, so too does our capacity for capturing learning data. Or at least, it should. After all, if Google can get to know us through our search results, then digital and online learning platforms should be able to tell us how our students learn… right?
Well, unfortunately this isn’t always the case. Often these programs don’t use technology to its full capacity, completely missing the opportunity to use data to support learning.
This is particularly frustrating when it comes to online content delivery programs. The whole point of this content is to help students achieve mastery by scaffolding their learning through expertly written chapters or modules. Yet so many of these programs fall short of the mark.
The Substitution Augmentation Modification Redefinition (SAMR) Model gives us a lens to evaluate the extent to which a certain technology is impacting teaching and learning.
We can use the model to explore the example of an eBook. An eBook is essentially a traditional textbook that has been recreated digitally. In its new form, it can provide students with new benefits, like accessibility and affordability, but it still carries many of the downsides of the traditional textbook. EBooks provide a substitution (the best examples might stretch to an augmentation) of what textbooks do. For the most part, eBooks are still relatively static. There might be some updates and tweaks here and there, but the content largely mirrors that of the textbook. Additional questions can't be added to give students more practice, and wording isn't changed for clarification. Solutions are not edited to provide students with additional support and feedback when they need it. EBooks have not redefined teaching and learning or transformed the way content is delivered.
We can see this in the simple fact that eBooks don't provide data on their impact on students. Where data does exist, it's often limited to things like how long a student had the eBook open, which tells us very little about efficacy or the behaviours related to success in learning.
Without any information on how effective their eBooks are, the companies behind them can't make edits that improve student learning. The same goes for so many maths platforms that, although they present content to students in an interactive way, lack the integrated continuous assessment needed to establish a feedback loop for better content design. They can't rethink questions or chapters to better support mastery because they don't know whether students are mastering them in the first place. For the teachers who use these platforms, this can be particularly frustrating: they can see first-hand the improvements that need to be made, and often have to live with the shortcomings, making workarounds for students.
If the technology is there, what does it look like in practice? And is there anyone out there using it?
Teachers have adapted their practice to incorporate data-collecting tools that inform their teaching. In the same way, online learning programs should use their own data to continually improve their resources and, in turn, support better student outcomes.
In practice, this would involve collecting data from all students who use these programs and using it to reflect on the effectiveness of the content.
This process is made even better when teachers can also have input by submitting their observations from the classroom.
When content is continually improved and iterated on, students can progress faster with fewer misconceptions and a deeper understanding. That's exactly why Maths Pathway has leveraged technology to collect learning data from the 80,000 students using our model.
We now have the unique ability to use real-time data to update our content when issues arise or when clear trends emerge. If thousands of students are tripping up on the same question, our Learning Team knows about it and can use their expertise to fix it. The insights from this data also enable the team to be more strategic and targeted, allowing them to spend time where the learning impact will be greatest. Better still, they can see the efficacy of each change, ensuring the content they're delivering is the best it can possibly be.
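To make the idea concrete, here is a minimal sketch of how this kind of flagging might work in principle. This is an illustrative example only, not Maths Pathway's actual system: it assumes attempt logs reduced to simple (question ID, correct/incorrect) pairs, and the function name and thresholds are hypothetical.

```python
from collections import defaultdict

def flag_problem_questions(attempts, min_attempts=100, max_error_rate=0.5):
    """Flag questions whose error rate exceeds a threshold.

    `attempts` is an iterable of (question_id, is_correct) pairs --
    a simplified stand-in for real attempt logs.
    """
    totals = defaultdict(int)   # attempts per question
    errors = defaultdict(int)   # incorrect attempts per question
    for qid, correct in attempts:
        totals[qid] += 1
        if not correct:
            errors[qid] += 1

    flagged = []
    for qid, n in totals.items():
        # Only flag questions with enough data to trust the error rate.
        if n >= min_attempts and errors[qid] / n > max_error_rate:
            flagged.append((qid, errors[qid] / n))

    # Worst questions first, so reviewers spend time where impact is greatest.
    return sorted(flagged, key=lambda x: x[1], reverse=True)
```

The `min_attempts` cut-off matters: a question with only a handful of attempts can show a wildly misleading error rate, so a content team would want a sample-size floor (or a proper statistical test) before acting on the data.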
Download the whitepaper below
Leveraging learning analytics for better educational content design
Summary paper: Analysis of Maths Pathway module design