Now onto day 3 at LASI 2013. A lot happened this morning across three panels and a 45-minute breakout (birds of a feather) session, so I’m just going to touch on a few things that really stood out to me.
Taylor Martin & Nicole Forsgren Velasquez gave a really nice talk about the kinds of learning analytics they are using to understand and evaluate student strategies in problem solving (in particular using a game they have developed called Refraction). What they were able to do is break down student solving strategies into 5 different categories (Slow & Steady, Haphazard, Explorer, Strategic Explorer, and Careful), and, perhaps more interestingly, they were able to understand the results each of these strategies tended to produce (I’ll link to their PowerPoint in a bit for the nitty gritty). Phil Winne followed this up with another really interesting talk about understanding student thinking and trying to get “into” their heads – both of these talks really drove home the point that one thing LA can do is help us get a sense of how kids are thinking about and acting on/within the curricula we design. The last speaker in the session was Sidney D’Mello, who talked about students’ emotions and learning – in particular he offered the approach of “Learning with Contradictions”, which induces disequilibration in students to prompt reflection and learning, and which definitely resonates with Manu Kapur’s ideas around Productive Failure (2008). Check out the whole session here.
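To make the strategy-categorization idea concrete, here is a minimal sketch of how interaction logs might be bucketed into the five labels from the talk. This is purely my own hypothetical illustration – the thresholds, features (pause length and undo rate), and rules are made up and are not the authors’ actual method:

```python
# Hypothetical sketch: bucketing a student's move log into the five
# strategy labels from Martin & Forsgren Velasquez's talk.
# Features and thresholds here are invented for illustration only.

def classify_strategy(moves):
    """moves: list of (seconds_since_last_move, was_undo) tuples."""
    if not moves:
        return "Haphazard"
    n = len(moves)
    avg_pause = sum(t for t, _ in moves) / n          # deliberateness
    undo_rate = sum(1 for _, undo in moves if undo) / n  # backtracking

    if avg_pause > 10 and undo_rate < 0.1:
        return "Careful"           # long pauses, few reversals
    if avg_pause > 10:
        return "Slow & Steady"     # deliberate, but revises as it goes
    if undo_rate > 0.5:
        return "Haphazard"         # rapid-fire trial and error
    if undo_rate > 0.2:
        return "Explorer"          # quickly tries many branches
    return "Strategic Explorer"    # fast, mostly forward progress
```

The point is less the specific rules than the pipeline: raw clickstream data gets reduced to a few behavioral features, which are then mapped onto interpretable strategy categories that can be linked to outcomes.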
During the breakout session later in the morning, Alyssa Wise organized a group of us around trying to get at the big ideas of “What problems do we think learning analytics can solve? WHOSE problems are they?” I thought this was a great way for us to start really thinking deeply about why we’re at this conference and what we’re trying to get as a productive outcome to help us in shaping our research and the field. Judy Kay had our group brainstorm what these questions meant to us, and what kinds of smaller, more focused questions could help us answer the larger ones posed by Alyssa. For me (and this is a bit different from some of the others at the conference), the challenge that I believe LA can help address is supporting complex real-time enactments, especially in unpredictable inquiry activities. To this end there is an interesting issue about the “granularity”, or scale, of the analytics and the interventions that we want them to inform/act upon. To me there are at least 3 that stand out as a baseline:
- Real-time in class, supporting on the fly decisions about classroom orchestration
- After or between classes, giving more detailed information about the state of the class’ knowledge or performance to aid in scripting upcoming classes
- After the course/unit, for assessment and also for self-reflection on the teaching and learning outcomes (this might be really valuable for students too)
Finally, Phil Winne reminded us that students are agents who make choices throughout learning activities, even unexpected ones.
A couple of really interesting talks today by some serious heavyweights in the learning, technology, and innovation fields: Stephen Coller (Gates Foundation), Ken Koedinger (CMU) & Ryan Baker (Columbia).
Stephen Coller opened the talks by discussing how learning analytics can help us transform education, and the challenges of bringing this to “market” (adoption at larger scales). Stephen noted that most people in education share a common goal of improving “the instructional core”. What I found particularly interesting is that he touted 3 key ways to really improve learning gains for students: (1) Change the rigour of the content that students are being asked to interact with; (2) Increase the knowledge and skills of the teachers teaching the content; or (3) Alter the relationship of the student to the teacher and the content. Each of these is a really powerful (but complex) lever for change, and Stephen pointed out that changing one will probably have an effect on the other two as well.
He also noted the Minerva Project, which approaches higher education through an interdisciplinary, real-world, authentic problem-based curriculum. He noted that this form of learning may very well find its way into smaller college settings. This development of authentic skills applied to real problems is really interesting, and I could see it being a real motivator for students within such curricula.
Ken and Ryan did back-to-back talks on what drilling deeply into student interaction and outcome data has told them about their designs, and how to either improve those designs or gain direct insight into teaching and instruction practices. What really stood out here was the notion of “expert blind spots” – that we as researchers too often feel we know what we know (when in truth we might be missing critical factors or insights because we aren’t looking at the issue objectively). Instead, Ken and Ryan showed how LA and data mining can reveal things about our curricular designs that we may not have seen (and that could be adversely affecting those designs). Ken stated that “Intuitive design is not reliable”, and that careful analysis was more fruitful. I challenged him slightly on this, noting that intuition is sometimes needed in new or innovative designs (where we don’t have rich data to build on), and he gave me one of the most memorable quotes of the conference: “The hare of intuitive design and the tortoise of cumulative science”.
After listening to a really nice opening panel at LASI 2013 on the idea of Big Data and Learning Analytics (LA), a couple of things came to mind about this emerging field.
I’m always worried that LA is going to put learning too much “on rails” – that is to say, that we automate the process so much that students and teachers are taken out of the decision and learning processes, with the “data” simply crunched and decisions made for them. I’m heartened to see that this concern is shared by the panel as well.
Alyssa Wise mentioned that LA needs to be “learner centered”, which I think is vital: even as we begin to gather and process all of this data to make sense of it, we need to remember that it’s about the students, and all of our practices need to be focused on how we can help and enable learners to learn. I was glad that Dan Suthers also pointed out that learning at its core is a complex phenomenon, but that there is a promise being held out by LA to help us “understand and manage learning in its full complexity”, and to help optimize learning. My big question, and one that I think should be central to this whole conference, is: what do we mean by optimizing learning? A lot of the ideas at this conference revolve around optimization, and I hope that we continue to discuss/debate what optimization means within complex and varied learning communities and approaches.
This idea by Dan goes very nicely with George Siemens’ idea of increasing learner individualization, and with Phil Winne’s idea of engaging learners as participants in an ecology of experimentation. We want students to be authentic drivers of inquiry, investigation, and knowledge construction, and we want to leverage LA as a means of aiding them in these processes – by connecting them to new peers, new resources, and new ideas that they may otherwise have been “blind” to (similar to what Dan said about weak ties).
My personal hope is that LA does live up to this ideal of really empowering learners to learn in ways that would otherwise be impossible (or prohibitively time consuming), and also, critically, giving teachers insight into the state of their class’ knowledge (and perhaps deeper information about the “global” state of knowledge) to drive learning and exploration in exciting new ways.
Only one morning in and so far very interesting and exciting – looking forward to the next few days!