As I reflect on what I have learned about learning analytics, my thoughts return to my original post about Clint Lalonde's work on Open Homework Systems (OHS) (Donahue, 2022). As I argued there, OHS may benefit students transitioning from K-12 into higher education by addressing skill weaknesses or deficits. Learning analytics, understood as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs" (Siemens, 2013, p. 1382), may support the effective use of OHS by identifying the students most in need of intervention and support, helping to ensure their success in higher education.
As outlined in Sclater et al.'s (2016) report for Jisc, learning analytics have been utilized in a variety of ways. The University of New England's early alert system gave students opportunities to develop a sense of community and share their feelings, and alerted support staff to students who were struggling. Purdue University combined student performance, Virtual Learning Environment (VLE) interaction, academic history, and student characteristics to produce "a 'traffic light' indicator showing how at risk each student is considered to be" (p. 27). Nottingham Trent University used predictive analytics based on data about student engagement and deliberately excluded demographic data because "a student can only alter their behaviour, not their background" (p. 35). Each institution collected different data, but in every case the aim was to identify students who potentially needed support.
It is clear that learning analytics may be useful in identifying at-risk students. OHS developers could utilize this information to develop content in areas of student weakness and target the population of students who would benefit most from OHS support. However, it is important to be mindful of the darker side of learning analytics and remember that "the learning process is essentially social and cannot be completely reduced to algorithms" (Siemens, 2013, p. 1395). Prinsloo and Slade (2014) further argued that there are moral implications of collecting data to triage educational needs, and provided a reminder that institutions must always act in students' best interests. Eaton (2021) supported this notion and contended that transparency must be equal at all levels to ensure that students are "citizens in and not objects of the LMS" (para. 15). While learning analytics can certainly be utilized to support student success, the ethical and moral considerations must be of paramount importance to all stakeholders.
References
Donahue, A. (2022, February 12). The potential of Open Homework Systems (OHS) to bridge the gap between K-12 and higher ed. In M. Harrison, Managing change in digital learning. https://untextbookdemo.opened.ca/voice/the-potential-of-open-homework-systems-ohs-to-bridge-the-gap-between-k-12-and-higher-ed/
Eaton, L. (2021, December 10). The new LMS rule: Transparency working both ways. The Journal of Interactive Technology & Pedagogy. https://jitp.commons.gc.cuny.edu/the-new-lms-rule-transparency-working-both-ways/
Prinsloo, P., & Slade, S. (2014). Educational triage in open distance learning: Walking a moral tightrope. International Review of Research in Open and Distributed Learning, 15(4), 306–331. https://doi.org/10.19173/irrodl.v15i4.1881
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education: A review of UK and international practice. Jisc. https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v3.pdf
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851