There’s an allegory from Buddhism that describes how four blind men are led to an animal they’ve never encountered before. Feeling the animal with their hands, each man describes what they “see”. One man says it’s very much like a rope. Another says it’s less like a rope and more like a tree trunk. A third says, that’s not even close; it’s really just a giant, soft wall. The fourth says it’s not a wall so much as a heavy cloth curtain.
Of course they’re each describing just one part of a very large animal: an elephant.
The moral of the story tends to be that people interpret the world based on their limited experience, and that we hold to our own interpretations while discounting others’, even though, in the end, we may only be correct cumulatively.
I’ve felt a lot like one of those blind men (though arriving at a very different moral) as part of a team that has spent the past 18 months further exploring learning analytics. We've talked with over 30 institutions in the US, Australia, the UK, and Europe, learning what they are doing with learning data and discussing our ideas for Canvas Analytics 2 (and beyond).
It’s relatively easy to say what learning analytics is — from whatever point of view you approach it. For some, learning analytics is a way to hold students accountable for the work required to earn a grade. Or perhaps learning analytics is a way of predicting user behavior so you can more promptly intervene. Or a way of identifying the most effective course designs and teaching practices. Or a way of representing teaching and learning to the agents (teacher or learner) for metacognitive goals.
These are all learning analytics projects or practices that we have heard or read about happening today. It is much harder to say what learning analytics will be once we have a full picture of the beast — that is, as different projects with different aims develop and produce results over time.
We talked about learning analytics in a panel discussion last month during CanvasCon Scandinavia. Panelists included…
Jane James of the University of Birmingham, which is working on using data and analytics to both understand and catalyze the use of technology, while "softening" the terminology so it's more accessible and approachable to faculty and students.
John de Maria of Stiftelsen Viktor Rydbergs Skolor, which is working on optimizing teaching data for teachers as a means of empowering them to drive instructional improvement, primarily to ensure every student is adequately engaged as part of the school's digital transformation initiative.
Eric Slaats of Fontys ICT, which is delivering data and analytics to students for their own benefit as they continue to transform the conventions of teaching and learning. Fontys ICT is also seeking insights from broader personal and environmental data -- “anything we can get our hands on” -- in order to understand external influences on learning.
What did we learn from this panel, aside from the fact that learning analytics is a very large beast indeed? For me, there were four key takeaways from the shared perspectives:
1. Teaching faculty need to support and buy into the goals of any learning analytics project and be given the power to use the data in pursuit of better teaching. Many faculty have strong, legitimate fears that data could be used either to disempower educators or to misrepresent teaching and learning with incomplete or inaccurate data. Indeed, in some versions of the elephant allegory, the blind men come to believe the others are liars, and it creates a rift in the group.
2. Students are too often ignored in these conversations, except as the subject of learning analytics — or the object of actions we might take. But if this is indeed the students’ data and if we want students to become not just objects of our teaching, but reflective, self-regulated learners in their own right, students should come first.
3. Administrators and advisors believe learning analytics can help them do their jobs better by highlighting people or courses that most deserve their attention. Data and analytics -- if made comprehensible -- can give more immediate insight into how students are doing and ways of replicating the best practices of teachers. It's up to technologists to make that data both easily accessible and “safe” in the context of the teacher and student issues highlighted above.
4. Culture matters, both the local culture of teaching and learning within the institution or program, and the educational culture of the region or country. In some departments, leadership oversight over teaching is the norm; in other departments, teaching faculty hold academic freedom as critical to their success. In some countries, frequent, scored assessments and activities are the norm; in other countries, scored assessments are infrequent, though student interaction may still be high.
Though learning analytics has made progress in small but significant ways, like predicting whether a student will pass or fail a course or recommending course content, we still don’t know how much data can tell us about effective teaching and learning. Learning analytics will need to prove its value in tackling these much larger challenges over time, data model by data model. That means we must plan long-term research that consists of much trial and error, building up and tearing down. In the meantime, there is a real risk that immature, untested models could have negative consequences for teachers and students alike. Like the blind men in the elephant allegory, many of us must work together and share our experience, our results, and even our biases in order to understand this strange learning analytics beast.
We must also balance the necessity of iterative development — including trial and error — with sensitivity and caution toward the people we intend to serve. This is where we diverge from the blind men — sure, we want a true understanding of something new, grand, and exciting. But we don’t do learning analytics just to improve our understanding of online learning; we do learning analytics for the students and teachers we serve.
Keep learning,
Jared Stein
VP of Higher Education Strategy, Instructure