My thoughts on the ‘Future of Assessment and Feedback’ conference – November 2021

In November 2021 I attended the Future of Assessment and Feedback conference, organised by EUNIS, GÉANT, IMS Global and Jisc.

It was a two-day event covering a wide range of topics: effective practice, ways to scale up activities while maintaining standards, and technical developments such as LTI (a technical solution for linking third-party tools to VLEs like Blackboard) and QTI (an interoperability standard for authoring multiple-choice questions).

Overall, I thought it was a worthwhile event featuring international speakers and subject experts, with lots of activities to engage with: demonstrations, panel discussions and an opportunity to chat with the experts ‘in the taverna’, a virtual meet-up space created on Gather.town.

I have collated my thoughts in this blog post to share them with colleagues and anyone else interested in assessment and feedback practice. Comments welcome 😊

As someone who has been supporting digital assessments for almost two decades, the opening talk, Good assessment and feedback principles (Gill Farrell, Lisa Gray and Sarah Knight), really resonated with me:

assessment is an area traditionally stubbornly resistant to change, but change has been forced upon us by the pandemic

I certainly recommend the Jisc assessment and feedback programme as a good place to begin to understand the transformation of digital assessment and to develop local guidance. In fact, some of its research publications, such as ‘Transforming assessment and feedback with technology‘, have informed our own guidance on the assessment lifecycle, while the principles devised by the REAP project (Nicol and Macfarlane-Dick, 2006) are still part of our references and core resources. In truth, I don’t think we have a page on our website that doesn’t link to a Jisc publication! This shows the amount of work that has gone into developing assessment and feedback practice over the last twenty years, and the impact that the research has had at a local level; in our case, the development of our own University principles for assessment and feedback in taught programmes, implemented in 2015.

BUILDING THE ASSESSMENT ECOSYSTEM

Following on from the Jisc publications, Gill’s second talk, about the assessment ecosystem, looked at the EMA (Electronic Management of Assessment) work in more detail. In 2014 Jisc launched the EMA project, a landscape review of digital assessment in the UK, which resulted in a lot of good guidance that we have used over the years to develop our own workflows and to scale up activities at an institutional level. I liked the ‘Painometer’ comparison (2014–2021): it’s a great way to show which areas of EMA staff and students were, and are, most dissatisfied with. The comparison also highlights how changes in requirements and policies have influenced users’ satisfaction; for example, in 2014 accessibility and inclusion were problematic for 5% of respondents, but that figure rose to 50% in 2021. As Gill said, “have we got worse? Or is it that we are now more aware of these issues?” Well, it must be the latter!

Building the assessment ecosystem, Gill Farrell

MOVING OUT OF THE STONE AGE OF LEARNING DESIGN (keynote)

Ewoud De Kok, CEO and founder of FeedbackFruits (an EdTech company founded in 2012 in the Netherlands), gave a very engaging talk (no PowerPoint slides!) about three main threats to higher education at large, and specifically to degree qualifications offered by traditional academic institutions.

  • Traditional colleges and universities have relied for far too long on their brand names, while more and more companies are assessing people on skills rather than on CVs.
  • There are more agile and flexible learning experiences offered by private companies or as part of professional training, which are more relevant and focussed than traditional university learning.
  • The amount of attention that students devote to their studies is diminishing.

What can we do about it? One answer is to keep developing the ‘learning experience’: the research in educational science and the effective use of technology, in both blended and online learning.

Moving out of the stone age of learning design, Ewoud De Kok

MAKING LARGE CLASSES FEEL SMALL

I couldn’t attend the talk ‘Making large classes feel small’ by Danny Liu and Kimberley Baskin, but I listened to the recording and thought I would include it in this post because SRES looks like a really useful tool; something to add to my horizon-scanning list! The system was developed specifically to engage with students on a personalised level and to help them feel like part of a group, not just ‘lost in the crowd’. Its development was underpinned by the idea that feedback is a process, not just a one-way communication, and that it needs to respond to both staff and student needs.

The system helps staff to collate, analyse and visualise data easily, and it generates personalised student reports that staff can send out using a variety of communication tools. From the student perspective, the LMS integration and the personalised reports, which can include information like their preferred names, grades and feedback, have helped to increase engagement with learning activities and satisfaction. I think that having an ‘all in one place‘ option would be an advantage to teaching staff, and I’d be interested to explore these functionalities to see if they could improve what we currently provide.

Examples of use and information on the free licence agreement are available on the SRES (Student Relationship Engagement System) homepage.

Making large classes feel small, Danny Liu & Kimberley Baskin

INTRODUCTION TO LTI 1.3 AND LTI ADVANTAGE

This talk was very timely for me because I have recently started to curate information and experiences about the Turnitin LTI 1.3 integration for our next piece of development work. I thought Martin did a great job of making his presentation accessible to anyone like me who is not involved in technical architecture (not that I wish to be 😊). Having a high-level overview of what LTIs are and can do was extremely useful. We already use LTI integrations for other tools, and looking at the specs it seems that LTI 1.3 will be an improvement for both the staff and student experience. If I were to follow up on this, I’d like to find out more about the customised assessment workflows; given that our own EMA workflows have now been fully adopted, I’d be interested to find out how easily they could be translated.

Introduction to LTI 1.3 and LTI Advantage, Martin Lenord
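To make the “high-level overview” a little more concrete: in LTI 1.3 a launch is a signed JWT sent from the platform (the VLE) to the tool, carrying claims whose URIs are defined in the IMS (1EdTech) LTI 1.3 specification. The sketch below is not from the talk; the claim URIs are real, but the issuer, client ID, user and resource identifiers are placeholders I made up, and real tools must also verify the JWT signature against the platform’s public keys.

```python
# Sketch of the claims in an LTI 1.3 resource-link launch message.
# Claim URIs come from the IMS LTI 1.3 spec; all values are placeholders.

LTI = "https://purl.imsglobal.org/spec/lti/claim/"

launch_claims = {
    "iss": "https://vle.example.ac.uk",     # the platform (issuer) - placeholder
    "aud": "example-tool-client-id",        # the tool's client ID - placeholder
    "sub": "student-1234",                  # the launching user - placeholder
    LTI + "message_type": "LtiResourceLinkRequest",
    LTI + "version": "1.3.0",
    LTI + "deployment_id": "deployment-1",
    LTI + "resource_link": {"id": "assignment-42"},
}

def looks_like_lti13_launch(claims: dict) -> bool:
    """Check that a decoded launch payload contains the LTI 1.3 essentials.

    A production tool would first verify the JWT signature and the
    exp/iat/nonce claims; this sketch only checks claim presence.
    """
    required = [
        "iss", "aud", "sub",
        LTI + "message_type", LTI + "version",
        LTI + "deployment_id", LTI + "resource_link",
    ]
    return (all(key in claims for key in required)
            and claims[LTI + "version"].startswith("1.3"))
```

The long claim URIs are the point of the design: they namespace LTI data so it can sit alongside standard JWT claims without collisions.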

WHAT’S NEW IN QTI 3.0

These talks looked at the development of the QTI open standard for writing multiple-choice questions, which I have used off and on, although I’ve not had the chance to keep up with it in the last few years. I don’t think we are going to move to the systematic use of item banks, which would require a standard like QTI, but we provide some support for it, so it’s good to know what’s happening.

If I remember correctly, when I started my role in e-assessment support in 2006 most academic staff were interested in using an independent tool to create items, possibly offline, and in sharing them with colleagues (which meant attaching a file to an email!), but then using a delivery system of their choice to run the assessment. For this reason we purchased Respondus 4.0, which I sometimes still use to import and export questions in QTI format. However, Respondus never really took off, and it was superseded by Blackboard and Questionmark.

From a technical point of view, the Introduction to QTI 3.0, presented by Mark Molenaar, was interesting because it showed the evolution of the QTI standard from 2000 (version 1.2) to 2020 (version 3.0), and the range of new features it now offers: accessibility (for example, adding a glossary for non-English speakers), better customisation options, support for multimedia and interactive content, and integration with other systems, such as proctoring tools, via LTI.
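For anyone who, like me, last touched QTI a few versions ago: here is roughly what a single multiple-choice item looks like in QTI 3.0, which moved to kebab-case `qti-*` element names. The item below is my own invention for illustration (a real item would also declare the QTI XML namespace and response processing); I’ve embedded it as a string and parsed it with Python’s standard library just to pull out its parts.

```python
# A minimal, hypothetical QTI 3.0 multiple-choice item, parsed with the
# Python stdlib. Element names follow the 1EdTech QTI 3.0 naming style;
# identifiers and question text are made up, and the XML namespace and
# response-processing template are omitted for brevity.
import xml.etree.ElementTree as ET

item_xml = """
<qti-assessment-item identifier="item-1" title="Example question"
                     adaptive="false" time-dependent="false">
  <qti-response-declaration identifier="RESPONSE" cardinality="single"
                            base-type="identifier">
    <qti-correct-response><qti-value>B</qti-value></qti-correct-response>
  </qti-response-declaration>
  <qti-item-body>
    <qti-choice-interaction response-identifier="RESPONSE" max-choices="1">
      <qti-prompt>Which of these is an interoperability standard?</qti-prompt>
      <qti-simple-choice identifier="A">SRES</qti-simple-choice>
      <qti-simple-choice identifier="B">QTI</qti-simple-choice>
    </qti-choice-interaction>
  </qti-item-body>
</qti-assessment-item>
"""

root = ET.fromstring(item_xml)
choices = root.findall(".//qti-simple-choice")             # the answer options
correct = root.find(".//qti-correct-response/qti-value").text  # the key
```

Because the correct answer and the interaction are described declaratively like this, any conformant delivery system can render and score the item, which is exactly what makes shared item banks possible.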

The other talk about QTI, Assessing the process of thinking using QTI, showed a systematic implementation of the QTI standard for sharing item banks across different learning platforms, delivery systems and reporting tools. The FLIP project, a collaboration between the official educational assessment bodies of four countries, is a good example of how the QTI standard has been used to share knowledge and experience in e-assessment, technology development and digital transformation.

Assessing the process of thinking using QTI, Saskia Kespaik & Franck Selles

Introduction to QTI 3.0, Mark Molenaar

The full list of recordings and presentations is available on the EUNIS website.