FutureLearn Academic Network 24th October

The FutureLearn Academic Network event at the University of Leicester provided a chance to catch up with FutureLearn developments, including:

  • Intended audiences for courses: Several institutions are aiming FutureLearn courses at their own students and/or wanting to track the impact of publicly available courses on their students. Professional CPD is another growth area. There are debates to be had about the relative merits of CPD courses being closed or open to a wider audience.
  • Course evaluation and analytics: There were some wonderful presentations on how data extracted from courses can be used to understand the learning taking place in FutureLearn courses. Sylvia Gallagher’s research at Trinity College Dublin uses visualisations of FutureLearn discussion to identify discursive themes and analyse whether comments are ‘on task’ (related to the intended learning outcomes). Sylvia is also evaluating the impact of infographics summarising each week of the course. Ben Fields from FutureLearn is exploring time on task: the amount of time that elapses between a learner first starting a step and marking it as complete. This could appear as a report in the platform to aid educators in the future. Southampton University are using data to try to predict dropouts and to compare stated weekly learning time with the actual time learners spend on the platform.
  • FutureLearn and relevant external tools: FutureLearn are ready for a further roll-out of the long-awaited group tools. Early indications are that they can work for courses whose design is heavily structured towards group work – food for thought for Bristol Futures course design. An external tool called Georama may also have applications for Bristol Futures. It is an immersive technology that could give a learner some feeling of engagement with a live event from a distance, for example a field trip. As with the group function, the purpose would need to be clearly thought through. Would a live experience be of enough benefit to enough learners to make using it worthwhile?
  • Methods of accrediting and verifying learning: Professor Mike Sharples (Open University) outlined some future-looking research using ‘blockchain’ technology. Blockchain was developed to verify Bitcoin transactions, but is now being piloted to verify learning achievement. Could this become a more mature version of Open Badge technology? See The Blockchain and Kudos: A Distributed System for Educational Record, Reputation and Reward. OU experiments include using it within an ePortfolio.
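Ben Fields’ time-on-task metric is straightforward to sketch. The snippet below uses an invented event-log format (FutureLearn’s real data exports will differ) to compute the elapsed time between a learner first starting a step and marking it complete:

```python
from datetime import datetime

# Hypothetical event log: (learner_id, step_id, event, ISO timestamp).
# This format is an assumption for illustration only.
events = [
    ("u1", "1.4", "start",    "2016-10-24T09:00:00"),
    ("u1", "1.4", "complete", "2016-10-24T09:12:30"),
    ("u2", "1.4", "start",    "2016-10-24T10:05:00"),
    ("u2", "1.4", "complete", "2016-10-24T10:09:00"),
]

def time_on_task(events):
    """Seconds between a learner *first* starting a step and completing it."""
    starts, durations = {}, {}
    for learner, step, event, ts in events:
        t = datetime.fromisoformat(ts)
        key = (learner, step)
        if event == "start" and key not in starts:
            starts[key] = t          # keep only the first start
        elif event == "complete" and key in starts:
            durations[key] = (t - starts[key]).total_seconds()
    return durations

print(time_on_task(events))
# {('u1', '1.4'): 750.0, ('u2', '1.4'): 240.0}
```

Note that only the first ‘start’ is kept, matching the description of measuring from when a learner first starts a step; a real report would also need to handle learners who revisit steps over several sessions.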

Schools and eLearning – Education ICT 2016 and visit to Microsoft

Last week I attended two events in London that gave a flavour of eLearning in the school sector. The first was a conference entitled Education ICT 2016; the second was a visit to see education experts at Microsoft, who are doing a lot with schools and increasingly with universities. Things are changing fast in schools, particularly with the use of tablets by students. We can learn from what is happening in the sector, and there is interest from schools in what we are doing in HE.

Education ICT Conference 2016

Pete Herbert and I presented at the Education ICT Conference in Westminster on Wednesday 29th June. We had the tough job of following an excellent presentation from Dr Neelam Parmar, Director of Elearning at Ashford School. Neelam described her engagement with staff to identify pedagogic approaches and develop workflows for a variety of apps used in class on tablet devices. Many of these apps are free and could be of use in HE.

Pete and I spoke about scaling up the digitisation of content through Mediasite and our aspirations to move beyond simply capturing content to doing something more transformative. Pete illustrated the scale of use of Mediasite at Bristol, which has had over a million views, and also described how academics here are:

  • using analytic data to determine the areas students return to in the recordings to ask questions about why students might focus on those elements, eg is there a concept they are trying to better understand?
  • using flipped techniques and video feedback. In other words, changing teaching practice through the technology.

We alluded to aspirations to partner with students in areas of course and material design and how we are learning from MOOCs to change what we deliver to our own students. I was then on a panel session with some challenging questions from the floor about how we engage staff and students in change, and how students can partner with us in making change happen. Coincidentally, one of the other panel members, Kevin Sait, Head of IT Strategy at Wymondham High Academy Trust, delivered part of the session I attended at Microsoft on Friday.

Visit to Microsoft

This was an opportunity to see what Microsoft are developing for the education market. The visit was arranged by colleagues from IT Services and attended by colleagues from IT Services and the Faculty of Health Sciences.

We enjoyed a demonstration of the Microsoft Surface Hub. In effect, this is a very advanced electronic whiteboard with powerful video conferencing functionality built in. The responsiveness of the touch screen was particularly impressive; poor responsiveness has been the main disadvantage of screens I have used in the past. The video conferencing (built on Skype) included Xbox technology that tracks the user to determine which camera to use. You can see that in the right-sized classroom, and with the right use cases, this could be an extremely effective tool. It could, for example, support those teaching across the clinical academies.

Kevin Sait demonstrated a range of Microsoft collaboration tools built into Office 365 and SharePoint. Of particular interest to one colleague was Sway (part of Office 365), billed as a digital storytelling tool. Much of the collaboration with students in Microsoft schools centres on OneNote, through which students can build and share content. Other colleagues could see huge potential in the cloud for collaborative staff activity, eg collaboration on exam papers.

There are some differences between schools and universities (for example, class size and types of teaching space), but there is much we can learn from what they are doing in schools. University student expectations will evolve as a result of what they are seeing in schools. We can start experimenting with tools like OneNote and the Office 365 package, which, like Google Apps, have great potential for both staff and student collaborative activity.

16th Durham Blackboard Users Conference 6th-8th January 2016

Congratulations to the team at Durham for putting on another excellent conference. For me, this is the most useful event of the year for those of us with the task of supporting Blackboard. It is truly a ‘user’-led conference. The agenda is set by users, and attendees are open and willing to share.

It was harder to pick out specific themes and emerging trends than at previous conferences. The conference title, ‘Learning from Failure’, can be interpreted in different ways (helpful for those submitting papers, but it resulted in lots of different topics being covered). Learning from mistakes is something we all know we should do, as students, teachers and institutions. Google’s approaches to ‘failing faster and smaller’ come to mind, as do Kolb’s learning cycle and the ‘validated learning’ approach used in lean design methodologies. For me, the key is to manage mistakes by limiting their impact, whilst creating a culture in which (whilst we try to get things right first time) we accept that this will not always happen and learn from the process. Failing is often part of success, provided lessons are learnt and changes made. When I was working with academics to create MOOC materials, the materials we had to recreate several times after testing turned out to be the best bits of those courses.

Eric Stoller’s keynote on social media touched on the failure theme by suggesting we are likely to fail (and learn) as each new technology comes along. In doing so we (and our students) develop our digital capabilities; for example, we learn more about identity and risk in these new spaces. A takeaway message for me is that whilst some spaces are very much at the social end of the spectrum, where student learning will usually happen without our intervention, we could do more to encourage students to use other tools. For example, LinkedIn is increasingly a tool of choice for recruitment, yet its age demographic is 40+. Should we be encouraging students to sign up to these tools? Eric praised LinkedIn Pulse, the publishing facility in LinkedIn, and mentioned that the company had recently acquired Lynda.com (a very successful provider of online learning materials).

Alan Masson from Blackboard (previously of the University of Ulster) highlighted a change in focus to a more mature embedding of TEL. VLEs are now business critical, and need appropriate robustness with 24/7 availability. Institutions are looking at deeper integration with student information systems. Alongside this, universities are looking at personalisation and the re-use of tools for things outside of core teaching. Sheffield Hallam, Leeds Beckett, Groningen and Durham are amongst those developing home pages for different kinds of user (we are experimenting with this with the South West Doctoral Training Partnership). Edinburgh University have been using Blackboard Collaborate for virtual open days (we have done something similar here with Google Hangouts).

Alan and Blackboard are keen to support user groups, something I enjoyed and benefited from when working in the North East, and which is a bit lacking here in the South West. I am keen to see if we can get some activity going, perhaps initially involving our G4 partners.

I attended several sessions on learning analytics. Andy Ramsden (once of Bristol University, now with Blackboard) is working with JISC to look at institutional cultural and technical readiness for analytics. Derby University are experimenting with the Blackboard analytics tool, initially to interrogate activity in their online teaching division (University of Derby Online). Edinburgh have designed a student-focussed analytic tool: students can see both their performance on tests relative to peers and their online course activity (clicks) relative to peers. The information is also useful to teachers. Edinburgh are now looking at a data warehouse solution for the future, which would allow much deeper analysis, presumably across a variety of systems.
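To illustrate what a ‘relative to peers’ view might involve, one simple approach is a percentile rank over the cohort. This is a hypothetical sketch of the general idea, not the actual Edinburgh implementation:

```python
def percentile_rank(score, cohort_scores):
    """Percentage of the cohort scoring strictly below this learner."""
    below = sum(1 for s in cohort_scores if s < score)
    return 100.0 * below / len(cohort_scores)

# Invented test scores for an eight-student cohort.
cohort = [45, 62, 58, 71, 80, 66, 52, 90]
print(percentile_rank(66, cohort))  # 50.0 -- four of eight peers scored lower
```

The same calculation works for activity counts (clicks) as well as test scores, which is presumably how a single tool can present both side by side.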

The holy grails (if you can have more than one grail) are to predict student retention and student performance in order to take preventative or supportive action. From the discussions, the reality is that whilst the data is there, and can be extremely useful, it is unlikely to answer these questions directly. What it can do is help us ask more questions requiring further investigation. For example, if there is lots of activity in a particular online course, is it because it is a very active course with engaged learners, or is it because information is hard to find? Does it matter if students are not using the library? What else might they be doing? What does a gap in learners’ online activity mean?

At Bristol some academics are already keen to look at data from our lecture capture system to see which parts of lectures students are watching. We can then ask (for example) whether students are looking at a particular segment of a lecture because it contains a tricky concept. For Blackboard, analytic data might help us understand the consistency of experience across the VLE – something we are asked about in relation to quality audits. I am keen to learn from Cardiff University, who have used something called Eesysoft to understand activity and target support at learners who need it through the VLE interface itself, in the form of contextual help.
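As a sketch of the kind of analysis involved: given playback spans from a lecture capture system (the data format below is invented; real Mediasite analytics exports will differ), we can bin viewing time to find the segments students return to most:

```python
from collections import Counter

# Hypothetical playback spans: (student_id, start_sec, end_sec).
views = [
    ("s1", 0, 120), ("s1", 300, 420),   # s1 rewinds to the 5:00-7:00 section
    ("s2", 300, 480),
    ("s3", 290, 350),
]

def segment_heat(views, bin_size=60):
    """Count how many playback spans touch each bin of the recording."""
    heat = Counter()
    for _, start, end in views:
        # Include every bin the span overlaps, even partially.
        for b in range(start // bin_size, (end - 1) // bin_size + 1):
            heat[b * bin_size] += 1
    return heat

print(segment_heat(views).most_common(2))
# [(300, 3), (360, 2)] -- all three students watched the 300-360s segment
```

A spike like the one at 300 seconds is exactly the prompt for the follow-up question in the paragraph above: is there a tricky concept at the five-minute mark?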

A number of institutions are integrating their student information system with Blackboard so that columns are automatically created in the Grade Centre and so that grades from assignments and tests can be transferred back to the information system once completed. This could dovetail with some of the online submission and marking work we are currently undertaking at Bristol. It could also feed into the Student Lifecycle Support Project implementation.

The conference was far from a failure, and I learned a great deal. I now need to build in time to follow up on some of the lessons learnt the hard way elsewhere and, with colleagues, continue to develop approaches that help us manage risk and learn as we experiment with new approaches.