Learning Analytics - Can we rigorously measure impact on students?
When deciding whether to implement a learning analytics (LA) tool, institutions often seek evidence that it will deliver measurable benefits. Since most LA interventions are likely to produce at least some positive effect, institutions gravitate towards tools that claim the largest, most meaningful effects.
Course Signals claimed to improve student retention by 21%, and many institutions adopted it over other tools because it was seen as the leader in predictive analytics. In late 2013, however, the veracity of its claims came into question: a simulation demonstrated that the reported correlation could arise without any causal link between taking a Signals-enabled class and staying in college, because students who stay enrolled longer naturally take more courses, including more Signals-enabled ones. At the time, the Course Signals study was one of the most cited LA papers, and so the rigour of the LA community's research approach was called into question.
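The flaw in that kind of retention claim can be reproduced in a few lines. The toy simulation below is our own illustrative sketch, not the published reanalysis of Course Signals: retention is decided first and has no causal dependence on Signals at all, yet because retained students take more courses (and so, by chance, more Signals-enabled ones), a naive comparison still shows a large "retention benefit" for students with two or more Signals courses. All parameters (`p_retain`, `p_signals`, course counts) are arbitrary assumptions.

```python
import random

random.seed(42)  # reproducible illustration

def simulate(n_students=10_000, p_retain=0.7, p_signals=0.5):
    """Toy model in which Signals has NO causal effect on retention.

    Whether a student is retained is decided first; retained students
    then take more courses, and each course is independently
    Signals-enabled with probability p_signals. All parameters are
    illustrative assumptions, not figures from the Course Signals study.
    """
    students = []
    for _ in range(n_students):
        retained = random.random() < p_retain
        n_courses = 8 if retained else 3   # persisters accumulate more courses
        n_signals = sum(random.random() < p_signals for _ in range(n_courses))
        students.append((retained, n_signals))
    return students

def retention_rate(students, predicate):
    """Retention rate among students whose Signals-course count satisfies predicate."""
    group = [retained for retained, n_signals in students if predicate(n_signals)]
    return sum(group) / len(group)

students = simulate()
rate_two_plus = retention_rate(students, lambda n: n >= 2)
rate_fewer = retention_rate(students, lambda n: n < 2)
print(f"retention, >=2 Signals courses: {rate_two_plus:.2f}")
print(f"retention, <2 Signals courses:  {rate_fewer:.2f}")
```

Despite zero causal effect being built into the model, the two groups differ sharply in retention, which is why correlational comparisons of this kind can badly overstate a tool's impact.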
Like many other users and developers of LA tools, we currently face the challenge of determining how best to rigorously measure impact on students. The Student Relationship Engagement System (SRES), developed at the University of Sydney, is a flexible tool that allows teachers to collect, curate, analyse and act upon data that is meaningful to their teaching practices. Although its human-centred approach distinguishes it from many LA systems that rely on blind data mining and one-size-fits-all predictive analytics, the challenge of establishing impact remains. It is compounded by the fact that if the system is working well, students should be largely unaware of its existence, as it becomes seamlessly integrated into teachers’ practices.
We invite you to discuss and share your thoughts on how researchers and institutions can more rigorously measure the impact of both top-down and bottom-up learning analytics tools on students.
Our driving questions for this discussion are:
- How can the learning analytics community develop better measures of learning (e.g. moving beyond crude proxies such as attendance)?
- How do you measure the impact of learning analytics when it is likely confounded by many other factors?
- How do you assess the impact of a human-centred LA tool (such as the SRES) with a range of functionality when it is being used in diverse contexts?
- Institutions have focused heavily on student retention. From a teacher's perspective, what data is the most important and useful?