Congratulations to the team at Durham for putting on another excellent conference. For those of us tasked with supporting Blackboard, this is the most useful event of the year. It is truly a user-led conference: the agenda is set by users, and attendees are open and willing to share.
It was harder to pick out specific themes and emerging trends than at previous conferences. The conference title, ‘Learning from Failure’, can be interpreted in different ways (helpful for those submitting papers, but it resulted in lots of different topics being covered). Learning from mistakes is something we know we should all do, as students, teachers and institutions. Google’s approach of ‘failing faster and smaller’ comes to mind, as do Kolb’s learning cycle and the ‘validated learning’ approach used in lean design methodologies. For me, the key is to manage mistakes by limiting their impact, whilst creating a culture in which (whilst we try to get things right first time) we accept that this will not always happen and learn from the process. Failing is often part of success, provided lessons are learnt and changes made. When I was working with academics to create MOOC materials, the materials we had to recreate several times after testing turned out to be the best bits of those courses.
Eric Stoller’s keynote on social media touched on the failure theme by suggesting we are likely to fail (and learn) as each new technology comes along. In doing so, we (and our students) develop our digital capabilities; for example, we learn more about identity and risk in these new spaces. A takeaway message for me is that whilst some spaces are very much at the social end of the spectrum, where student learning will happen, usually without our intervention, we could encourage students to use other tools more. For example, LinkedIn is increasingly a tool of choice for recruitment, yet its age demographic is 40+. Should we be encouraging students to sign up to these tools? Eric praised LinkedIn Pulse, the publishing facility within LinkedIn, and mentioned that the company had recently acquired Lynda.com (a very successful online learning materials provider).
Alan Masson from Blackboard (previously of the University of Ulster) highlighted a change in focus towards a more mature embedding of TEL. VLEs are now business critical, and need appropriate robustness with 24/7 availability. Institutions are looking at deeper integration with student information systems. Alongside this, universities are looking at personalisation and at re-using tools for things outside of core teaching. Sheffield Hallam, Leeds Beckett, Groningen and Durham are amongst those developing home pages for different kinds of user (we are experimenting with this with the South West Doctoral Training Partnership). Edinburgh University have been using Blackboard Collaborate for virtual open days (we have done something similar here with Google Hangouts).
Alan and Blackboard are keen to support user groups, something I enjoyed and benefited from when working in the North East, and which is a bit lacking here in the South West. I am keen to see if we can get some activity going, perhaps initially involving our G4 partners.
I attended several sessions on learning analytics. Andy Ramsden (once of Bristol University, now with Blackboard) is working with JISC to look at institutional cultural and technical readiness for analytics. Derby University are experimenting with the Blackboard analytics tool, initially to interrogate activity in their online teaching division (University of Derby Online). Edinburgh have designed a student-focussed analytic tool. Students can see both their performance on tests relative to peers and online course activity (clicks) relative to peers. The information is also useful to teachers. Edinburgh are now looking at a data warehouse solution for the future which would allow much deeper analysis, presumably across a variety of systems.
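To make Edinburgh’s idea concrete, the peer comparison can be as simple as a percentile rank over anonymised cohort scores. This is my own minimal sketch, not Edinburgh’s implementation; the function name and the sample data are invented for illustration:

```python
def percentile_rank(score, peer_scores):
    """Percentage of peer scores strictly below the given score.

    Returns None when there is no peer data to compare against.
    """
    if not peer_scores:
        return None
    below = sum(1 for s in peer_scores if s < score)
    return 100.0 * below / len(peer_scores)

# Illustrative cohort: a student scoring 72 sits above three of seven peers
cohort = [55, 60, 68, 72, 75, 81, 90]
print(round(percentile_rank(72, cohort), 1))  # prints 42.9
```

The same calculation works for activity counts (clicks) just as well as for test scores, which is essentially the kind of peer-relative view Edinburgh’s tool shows to students.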
The holy grails (if you can have more than one grail) are to predict student retention and student performance in order to take preventative or supportive action. From the discussions, the reality is that whilst the data is there, and can be extremely useful, it is unlikely to answer these questions directly. What it can do is help us ask more questions requiring further investigation. For example, if there is lots of activity in a particular online course, is it because it is a very active course with engaged learners or is it because information is hard to find? Does it matter if students are not using the library? What else might they be doing? What does a gap in learners’ online activity mean?
At Bristol, some academics are already keen to look at data from our lecture capture system to see which parts of lectures students are watching. We can then ask (for example) whether students are looking at a particular segment of a lecture because it contains a tricky concept. For Blackboard, analytics data might help us understand the consistency of experience across the VLE – something we are asked about in relation to quality audits. I am keen to learn from Cardiff University, who have used a tool called Eesysoft to understand activity and to target support at learners who need it, through the VLE interface itself, in the form of contextual help.
A number of institutions are integrating their student information system with Blackboard, so that columns are automatically created in the Grade Centre and grades from assignments and tests can be transferred back to the information system once marking is complete. This could dovetail with some of the online submission and marking work we are currently undertaking at Bristol. It could also feed into the Student Lifecycle Support Project implementation.
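None of these integrations were shown in code, but the round trip has a simple shape: diff the assessments held in the student information system against the columns already in the VLE, create whatever is missing, and push back only attempts that have been marked. The sketch below is purely illustrative; the function names and record shapes are my own assumptions, not the Blackboard or SIS APIs:

```python
def columns_to_create(sis_assessments, vle_column_ids):
    """Assessments present in the SIS but with no matching Grade Centre column.

    sis_assessments: list of dicts, e.g. {"id": "A1", "title": "Essay 1"}
    vle_column_ids:  ids of columns already present in the VLE.
    """
    existing = set(vle_column_ids)
    return [a for a in sis_assessments if a["id"] not in existing]

def grades_to_return(vle_grades):
    """Only completed, marked attempts are ready to push back to the SIS."""
    return [g for g in vle_grades if g.get("status") == "marked"]

# Illustrative data: one column already exists, so only A2 needs creating
assessments = [{"id": "A1", "title": "Essay 1"},
               {"id": "A2", "title": "Test 1"}]
print(columns_to_create(assessments, ["A1"]))  # prints [{'id': 'A2', 'title': 'Test 1'}]
```

In practice the two halves would run against the vendor APIs on a schedule, with the ‘marked’ filter preventing provisional grades from leaking back into student records.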
The conference was far from a failure, and I learned a great deal. I now need to build in time to follow up on some of the lessons learnt the hard way elsewhere and, with colleagues, continue to develop practices that help us manage risk and learn as we experiment with new approaches.