LASI Day #3 – Morning Sessions Smorgasbord

Now onto day 3 at LASI 2013. A lot happened this morning across three panels and a 45-minute breakout (birds of a feather) session, so I’m just going to touch on a few things that really stood out to me.

Taylor Martin & Nicole Forsgren Velasquez gave a really nice talk about the kinds of learning analytics they are using to understand and evaluate student strategies in problem solving (in particular using a game they have developed called Refraction). What they were able to do is break down student solving strategies into 5 different categories (Slow & Steady, Haphazard, Explorer, Strategic Explorer, and Careful), and perhaps more interestingly, they were able to understand the results each of these strategies tended to produce (I’ll add a link to their PowerPoint in a bit if you want the nitty gritty). Phil Winne followed this up with another really interesting talk about understanding student thinking and trying to get “into” their heads – both of these talks really drove home the point that one thing LA can do is help us get a sense of how kids are thinking about and acting on/within the curricula we design. The last speaker in the session was Sidney D’Mello, who talked about students’ emotions and learning – in particular he offered the approach of “Learning with Contradictions”, an approach that promotes disequilibration of students to encourage reflection and learning, and one that definitely resonates with Manu Kapur’s ideas around Productive Failure (2008). Check out the whole session here.

During the breakout session later in the morning, Alyssa Wise organized a group of us around trying to get at the big ideas of “What problems do we think learning analytics can solve? WHOSE problems are they?” I thought this was a great way for us to start really thinking deeply about why we’re at this conference and what we’re trying to get as a productive outcome to help us in shaping our research and the field. Judy Kay had our group brainstorm what these questions meant to us, and what kinds of smaller, more focused questions could help us answer the larger ones posed by Alyssa. For me (and this is a bit different from some of the others at the conference), the challenge that I believe LA can help address is supporting complex real-time enactments, especially in unpredictable inquiry activities. To this end there is an interesting issue about the “granularity” or scale of the analytics and the interventions that we want to inform/act upon. To me there are at least three that stand out as a baseline:

  • Real-time in class, supporting on the fly decisions about classroom orchestration
  • After or between classes, giving more detailed information about the state of the class’ knowledge or performance to aid in scripting upcoming classes
  • After the course/unit, for assessment and also for self-reflection of the teaching and learning outcomes (this might also be really valuable for students too)

Finally, Phil Winne reminded us that students are agents who make choices throughout learning activities, even unexpected ones.

LASI Day #2 – Morning Talks, Learning Analytics, Intuition, and Data Driven Design

A couple of really interesting talks today by some serious heavyweights in the learning, technology, and innovation fields: Stephen Coller (Gates Foundation), Ken Koedinger (CMU) & Ryan Baker (Columbia).

Stephen Coller opened the talks discussing how learning analytics can help us transform education and the challenges of bringing this to “market” (adoption at larger scales). Stephen noted that there is a common goal among most people in education to pursue the improvement of “the instructional core”. What I found particularly interesting is that he touted 3 key ways to really improve learning gains for students: (1) Change the rigour of the content that students are being asked to interact with; (2) Increase the knowledge and the skills of the teachers teaching the content; or (3) Alter the relationship of the student to the teacher and the content. Each of these is a really powerful (but complex) means for making these changes, and Stephen pointed out that changing one will probably have an effect on the other two as well.

He also noted the Minerva Project, which approaches higher education through an interdisciplinary, real-world, authentic problem-based curriculum. He noted that this form of learning may very well find its way into smaller college settings. This growing of authentic skills applied to real problems is really interesting, and I could see it being a real motivator for students within such curricula.

Ken and Ryan did back-to-back talks on how drilling deeply into student interaction and result data has informed their designs, either by improving them or by giving direct insight into teaching and instruction practices. What really stood out here was the notion of “expert blind spots” – that we as researchers too often feel we know what we know (when in truth we might be missing critical factors or insights because we aren’t looking at the issue objectively). Instead, Ken and Ryan showed how LA and data mining can reveal things about our curricular designs that we may not have seen (and that could be adversely affecting our designs). Ken stated that “Intuitive design is not reliable”, and that careful analysis was more fruitful. I challenged him slightly on this, noting that intuition is sometimes needed in new or innovative designs (where we don’t have rich data to build on), and he gave me one of the most memorable quotes of the conference: “The hare of intuitive design and the tortoise of cumulative science”.