Working in corporate learning for a private-sector organization, I'm often faced with the challenge of proposing one-size-fits-all technology solutions to meet the needs of different learning audiences (internal employees in different departments, and external partners who learn about our products). I am challenged to find learning systems that are adaptable to multiple audiences and sustainable over long periods of time, so that we see our return on investment. Although his context was different (working in Higher Ed to develop an Open Homework System), Clint Lalonde's article raised some similar themes about building one Ed Tech solution for multiple audiences. One thing I considered when reading his article was how my own decisions about the different audiences (role perspectives) might change if I were following his decision-making strategies, especially since I am unable to use Open Educational Resource (OER) systems in my corporate environment. My corporate learning team has often considered putting reinforcement training tools (such as Axonify) in place, but we would need to carefully build a strong case before getting backing for that sort of project. Lalonde explained that in his context he was looking to choose from existing technologies instead of creating one from scratch (Lalonde, 2019), and I agree that would be best in my context in order to optimize implementation time.
Another project consideration Lalonde described was analyzing the needs of learners in different subject areas when considering how the Open Homework System would operate. He gave the example of STEM topics in particular requiring frequent practice and "instant formative feedback" in order to effectively build skills and correct errors (Lalonde, 2019). In the setting of corporate training, I would likely need to focus on selecting systems that could meet the needs of multiple learner groups (for example, Claims Examiners who have limited direct contact with customers, and Customer Service agents who interact directly with customers all day), or the solution would not be seen as having broad enough appeal. I would likely not have the luxury Lalonde described of focusing on a system to meet the needs of a limited audience, unless it was part of a pilot project where I could measure the Ed Tech solution's effectiveness.
This comparison makes me wonder: Are the strategies we use to select and implement technologies in Higher Ed and Corporate Learning more alike or more different? What can we learn from each other and remix, re-think, and re-use to meet the needs of our individual contexts?
References
Lalonde, C. (2019, September 6). Some strategies for the Open Homework Systems project. EdTech Factotum. https://edtechfactotum.com/some-strategies-for-the-open-homework-systems-project/
Hi Andrea, I really liked how you were able to link the higher education piece to corporate learning, and a few points you made had me thinking about how data analytics could come into play as well when deciding on which technologies to implement.
Your first point about how challenging it can be to put together a strong business case to adopt (and expense!) new technology really hit home. I have been on multiple teams myself where we are charged with reviewing tech and deciding on implementation, and oftentimes we love it, but it comes down to 'can we sell upper management'. That is when the data collecting begins! And more often than not the team gets stuck in 'analysis paralysis', which is the act of continuously gathering data while never actually using said data to make a decision. Instead, the team sits in paralysis and doesn't move forward.
Something I took away from Sclater et al.'s report was that there is "a need for evidence based prioritisation [sic] of spending" (2016, p. 12) not just in higher ed but very much at a corporate level as well, and that can lead to a constant collection of data with no real output in regards to what to DO with the data, as though more info equals more money allocated. As KelloggInsight (2015) notes, there is a problem with management collecting data for the sake of having the data without understanding what to then do with it. When it comes down to making the right decision on a product or technology, data analytics can be powerful when done right, as long as it can "be integrated into the business plan itself. Whatever a company chooses to measure, the results will only be useful if the data collection is done with purpose" (KelloggInsight, 2015, n.p.).
References:
KelloggInsight. (2015, May 1). A leader's guide to data analytics: A working knowledge of data science can help you lead with confidence [Blog post]. https://insight.kellogg.northwestern.edu/article/a-leaders-guide-to-data-analytics/
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education: A review of UK and international practice. Jisc.
Hi Jessica and Andrea,
I find that we often run into this question: how do we decide what data we need to answer our questions (about a technology or learning intervention)? I am not sure if I shared this earlier in a different post, but I really appreciate the research that Bart Rienties and his colleagues do at the OU to examine their learning designs. They linked learning analytics data with learning success and learner satisfaction to map what types of learning activities lead to better outcomes. It was a very large study, but I appreciate how they used their analytics data, combined with other measures, to answer a very big question (and one that is personally relevant to my context in a very similar organization). To your point, Jessica, I think I have found myself in "analysis paralysis" with too much data, and then not knowing how to tell a story with it.
Toetenel, L., & Rienties, B. (2016). Analysing 157 learning designs using learning analytic approaches as a means to evaluate the impact of pedagogical decision making. British Journal of Educational Technology, 47(5), 981–992. https://doi.org/10.1111/bjet.12423