Relevant Data Can Help Tell the Story

Today’s entry revisits a reflection piece I wrote based on the 2013 post by D’Arcy Norman, Ph.D., “Lessons learned: AV systems design in the Taylor Institute.” I will again use this piece to consider how administrators might understand and evaluate how users have adopted the audio-visual (AV) system: first, how the space and collaboration carts are being used, and second, whether there is a positive relationship between use of the technology and space at the Taylor Institute (TI) and academic performance.

Many institutions have begun incorporating data analytics into their practice, both to help learners identify their own learning needs and to help instructors identify and provide support earlier in the learning journey, with the goal of reducing attrition. Learning analytics vary widely; they commonly draw on data collected from virtual learning environments (VLEs) and can measure inputs such as attendance, assignment submissions, demographic information, and engagement, to name only a few. Collecting and measuring student behaviour feeds predictive analytics that help facilitators identify high-risk students and reach out through automated messages or personal communication (Prinsloo & Slade, 2014; Sclater et al., 2016). Transparency about data collection is essential; that said, evidence suggests that students have found these tools effective and have generally received them well (Sclater et al., 2016). In addition to VLE data, Sclater et al. (2016) describe other collection methods that can complement the narrative of student success and shed light on factors that support course completion, including swipe cards, proximity cards, and other entry-system capture tools.

As an administrator, it would be valuable to use data that measures and logs student access to the Taylor Institute and use of the collaboration carts, whether through a computer log-in or a card-swipe system. Comparing that data to learner outcomes might be a start toward understanding whether any correlation exists between students’ opportunity to access the technology and increased knowledge or learning (a rough sketch of what such a comparison could look like follows below). The value of this relationship could have user-policy implications: is there educational value in keeping the doors unlocked when the space is not in use, especially if the costs of repairing and replacing equipment are high? On the other hand, it might lead to further questions about whether retention is gained by creating meaningful relationships for students around their learning experience.
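To make that comparison concrete, here is a minimal sketch of how an analyst might join swipe-card access counts to learner outcomes and test for a correlation. The column names, the sample data, and the choice of a simple Pearson correlation are my own illustrative assumptions, not anything described in Norman’s post or in the cited studies, and a real analysis would of course need proper consent, anonymization, and far more data.

```python
# Illustrative sketch only: join hypothetical swipe-card access logs to final
# grades and check whether visit frequency and grades are correlated.
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical swipe-card log: one row per entry to the Taylor Institute.
access_log = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s3", "s3", "s3"],
    "timestamp": pd.to_datetime([
        "2023-09-05 09:12", "2023-09-12 10:03", "2023-09-06 14:30",
        "2023-09-05 08:55", "2023-09-07 13:20", "2023-09-14 11:45",
    ]),
})

# Hypothetical learner outcomes (e.g., final course grade out of 100).
grades = pd.DataFrame({
    "student_id": ["s1", "s2", "s3"],
    "final_grade": [78, 64, 85],
})

# Count visits per student, then merge with outcomes.
visits = access_log.groupby("student_id").size().rename("visit_count").reset_index()
merged = visits.merge(grades, on="student_id")

# Pearson correlation between visit frequency and final grade.
r, p_value = pearsonr(merged["visit_count"], merged["final_grade"])
print(f"correlation r = {r:.2f}, p = {p_value:.3f}")
```

Even in this toy form, the sketch makes the limitation obvious: a correlation between building access and grades would not show that access causes learning, only that the two move together, which is exactly why the policy questions above would need further investigation.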

References

Prinsloo, P., & Slade, S. (2014). Educational triage in open distance learning: Walking a moral tightrope. The International Review of Research in Open and Distance Learning, 15(4), 306–331. https://www.irrodl.org/index.php/irrodl/article/view/1881/3060

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education: A review of UK and international practice (Full report). Jisc.

By: Nicole Croft

One thought on “Relevant Data Can Help Tell the Story”

  1. From your post I can understand how all this data can be very useful to institutes, both in helping individual students and in determining how to optimally run the institute. Your post, citing Sclater et al. (2016), says that “students have found these tools effective” and that students were supportive of this data collection. I think this type of learning data can be beneficial; my concern lies in whether or not students have a choice about whether this data is collected. Some may prefer not to have non-essential learning data collected. Do you think institutes should offer students a choice on whether this non-essential data is collected or not?
