My thoughts on the 'Future of Assessment and Feedback' conference – November 2021

In November 2021 I attended the Future of Assessment and Feedback conference, organised by EUNIS, GÉANT, IMS Global and Jisc.

It was a two-day event covering a wide range of topics: effective practice, ways to scale up activities while maintaining standards, and technical developments such as LTI (a technical solution for linking third-party tools to VLEs like Blackboard) and QTI (an interoperability standard used to write multiple-choice questions).

Overall, I thought it was a worthwhile event featuring international speakers and subject experts, with lots of activities to engage with: demonstrations, panel discussions and an opportunity to chat with the experts 'in the taverna', a virtual meet-up space created on Gather.town.

I have collated my thoughts in this blog post to share them with colleagues, or anyone else interested in assessment and feedback practice. Comments welcome 😊

As someone who has been supporting digital assessments for almost two decades, the opening talk, Good assessment and feedback principles (Gill Farrell, Lisa Gray and Sarah Knight), really resonated with me:

assessment is an area traditionally stubbornly resistant to change, but the change has been forced upon us by the Pandemic

I certainly recommend the Jisc assessment and feedback programme as a good place to begin to understand the transformation of digital assessment and to develop local guidance. In fact, some of the research publications, such as 'Transforming assessment and feedback with technology', have informed our own guidance on the assessment lifecycle, while the principles devised by the REAP project (Nicol and Macfarlane-Dick, 2006) are still part of our references and core resources. Actually, I don't think we have a page on our website that doesn't link to a Jisc publication! This shows the amount of work that has gone into developing assessment and feedback practices over the last twenty years, and the impact that the research has had at local level: in our case, the development of our own University principles for assessment and feedback in taught programmes, implemented in 2015.

BUILDING THE ASSESSMENT ECOSYSTEM

Following on from the Jisc publications, Gill's second talk, about the assessment ecosystem, looked at the EMA (Electronic Management of Assessment) work in more detail. In 2014 Jisc launched the EMA project, a UK-wide landscape review of digital assessment, which resulted in a lot of good guidance that we have been using over the years to develop our own workflows and to scale up activities at institutional level. I liked the 'Painometer' 2014–2021: it is a great way to show which areas of EMA staff and students were (and are) most dissatisfied with. The comparison also highlights how changes in requirements and policies have influenced users' satisfaction; for example, in 2014 accessibility and inclusion were problematic for 5% of respondents, but this went up to 50% in 2021. As Gill said, "have we got worse? Or is it that we are now more aware of these issues? Well, it must be the latter!"

Building the assessment ecosystem, Gill Farrell

MOVING OUT OF THE STONE AGE OF LEARNING DESIGN (Keynote speaker)

Ewoud De Kok, CEO and founder of Feedbackfruits (an EdTech company founded in 2012 in the Netherlands), gave a very engaging talk (no PowerPoint slides!) about three main threats to higher education in large societies, and specifically to degree qualifications offered by traditional academic institutions.

  • Traditional colleges and universities have relied for far too long on their brand names as institutions, while more and more companies are assessing people on skills rather than on CVs.
  • There are more agile and flexible learning experiences, offered by private companies or as part of professional training, which are more relevant and focused than traditional university learning.
  • The amount of attention that students devote to their studies is diminishing.

What can we do about it? One thing is to keep developing the 'learning experience', the research in educational science, and the effective use of technology in both blended and online learning.

Moving out of the stone age of learning design, Ewoud De Kok

MAKING LARGE CLASSES FEEL SMALL

I couldn't attend the talk 'Making large classes feel small' by Danny Liu and Kimberley Baskin, but I've listened to the recording and thought I would include it in this post because the SRES looks like a really useful tool, something to add to my horizon-scanning list! The system was developed specifically to engage with students on a personalised level and help make them feel like they are part of a group and not just 'lost in the crowd'. The development of the system was underpinned by the idea that feedback is a process, not just a one-way communication, and that it needs to respond to both staff and student needs.

The system helps staff to collate, analyse and visualise data easily, and it generates personalised student reports that staff can send out using a variety of communication tools. From the student perspective, the LMS integration and the personalised reports, which can include information like their preferred names, grades and feedback, have helped to increase engagement with learning activities and satisfaction. I think that having an 'all in one place' option would be an advantage to teaching staff, and I'd be interested to explore these functionalities to see if they could be an improvement on what we currently provide.
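I don't know how the SRES is built internally, but the mail-merge style of personalisation it automates is easy to picture. Below is a minimal sketch in Python of the same idea, assuming a made-up spreadsheet export (students.csv) with columns preferred_name, email, grade and feedback; the SRES itself does this (and much more) through its web interface and LMS integration.

    # Minimal mail-merge sketch: build one personalised message per student row.
    # Hypothetical CSV columns: preferred_name, email, grade, feedback.
    import csv

    TEMPLATE = (
        "Hi {preferred_name},\n\n"
        "Your result for this week's quiz was {grade}.\n"
        "Tutor comment: {feedback}\n"
    )

    def build_messages(csv_path):
        """Return a list of (email, message) pairs, one per student."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            return [(row["email"], TEMPLATE.format(**row))
                    for row in csv.DictReader(f)]

    if __name__ == "__main__":
        for email, message in build_messages("students.csv"):
            print(f"To: {email}\n{message}")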

Examples of uses and information on the free licence agreement are available on the SRES (Student Relationship Engagement System) homepage.

Making large classes feel small, Danny Liu & Kimberley Baskin

INTRODUCTION TO LTI 1.3 AND LTI ADVANTAGE

This talk was very timely for me because I have recently started to curate information and experiences about the Turnitin LTI 1.3 for our next piece of development work. I thought Martin did a great job of making his presentation accessible to anyone like me who is not involved in technical architecture (not that I wish to be 😊). Having a high-level overview of what LTIs are and can do was extremely useful. We are already using LTI integrations for other tools, and looking at the specs it seems that LTI 1.3 will be an improvement in both the staff and the student experience. If I were to follow up on this I'd like to find out more about the customised assessment workflows; given that our own EMA workflows have now been fully adopted, I'd be interested to find out how easily they can be translated.
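For my own notes, the thing that made LTI 1.3 click for me is that a launch is essentially a signed JWT (an OpenID Connect id_token) carrying standard IMS claims. The snippet below is my own rough illustration, not anything from the talk or from Turnitin: it assumes the PyJWT library and a platform public key, and skips the nonce/state checks a real tool would also need.

    # Rough illustration only: an LTI 1.3 launch delivers an OpenID Connect
    # id_token (a JWT) signed by the platform and carrying IMS-defined claims.
    # Assumes the PyJWT library; key retrieval and nonce/state checks omitted.
    import jwt  # pip install pyjwt

    LTI = "https://purl.imsglobal.org/spec/lti/claim/"

    def read_launch(id_token, platform_public_key, client_id):
        """Verify the launch JWT and pull out the claims a tool typically needs."""
        claims = jwt.decode(
            id_token,
            platform_public_key,
            algorithms=["RS256"],
            audience=client_id,  # the tool's client_id registered on the platform
        )
        return {
            "message_type": claims[LTI + "message_type"],    # e.g. LtiResourceLinkRequest
            "roles": claims[LTI + "roles"],                   # learner/instructor role URIs
            "context": claims.get(LTI + "context", {}),       # course/unit information
            "resource_link": claims[LTI + "resource_link"],   # the placement being launched
        }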

Introduction to LTI 1.3 and LTI Advantage, Martin Lenord

WHAT’S NEW IN QTI 3.0

These talks looked at the development of the QTI open standard for writing MCQ-type questions, which I have used off and on, but I've not had the chance to keep up with it in the last few years. I don't think we are going to move to a systematic use of item banks, which would require a standard like QTI, but we provide some support for it so it's good to know what's happening.

If I remember correctly, when I started my role in e-assessment support in 2006 most academic staff were interested in using an independent tool to create items, possibly offline, sharing them with colleagues (which meant attaching a file to an email!), and then using a delivery system of their choice to run the assessment. For this reason we purchased Respondus 4.0, which I sometimes still use to import/export questions in QTI format. However, Respondus never really took off, and it was superseded by Blackboard and Questionmark.

From a technical point of view the Introduction to QTI 3.0, presented by Mark Molenaar, was interesting because it showed the evolution of the QTI standard from 2000 (1.2) to 2020 (3.0), and the new range of features it now offers: accessibility (for example, adding a glossary for non-English speakers), better customisation options, support for multimedia and interactive content, as well as integration with other systems, such as proctoring tools, via LTI.
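To remind myself what an item actually looks like in the new version of the standard, here is a small Python sketch that writes out a single multiple-choice item in QTI 3.0-style markup. The element and attribute names are my best reading of the 3.0 documentation (which moved to the qti- prefixed, hyphenated naming), so treat them as illustrative rather than authoritative.

    # Writes a single multiple-choice item in QTI 3.0-style markup.
    # Element/attribute names follow my reading of the 3.0 spec and are
    # illustrative rather than authoritative.
    MCQ_ITEM = """<?xml version="1.0" encoding="UTF-8"?>
    <qti-assessment-item identifier="capital-fr" title="Capitals" time-dependent="false">
      <qti-response-declaration identifier="RESPONSE" cardinality="single" base-type="identifier">
        <qti-correct-response>
          <qti-value>B</qti-value>
        </qti-correct-response>
      </qti-response-declaration>
      <qti-item-body>
        <qti-choice-interaction response-identifier="RESPONSE" max-choices="1">
          <qti-prompt>What is the capital of France?</qti-prompt>
          <qti-simple-choice identifier="A">Lyon</qti-simple-choice>
          <qti-simple-choice identifier="B">Paris</qti-simple-choice>
          <qti-simple-choice identifier="C">Marseille</qti-simple-choice>
        </qti-choice-interaction>
      </qti-item-body>
    </qti-assessment-item>
    """

    with open("capital-fr.xml", "w", encoding="utf-8") as f:
        f.write(MCQ_ITEM)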

The other talk about QTI, Assessing the process of thinking using QTI, showed a systematic implementation of the QTI standard for sharing item banks across different learning platforms, delivery systems and reporting tools. The FLIP project, a collaboration between the official education assessment bodies of four countries, is a good example of how the QTI standard has been used to share knowledge and experience in e-assessment, technology development and digital transformation.

Assessing the process of thinking using QTI, Saskia Kespaik & Franck Selles

Introduction to QTI 3.0, Mark Molenaar

The full list of recordings and presentations is available on the EUNIS website.

'Formative language activities using technology', a free event organised by the School of Modern Languages and the Centre for English and Foundation Studies

As a language teacher I am always interested in what other colleagues do around assessment and feedback practice, so on the 19th of January I attended a free seminar organised by CELFS (Centre for English and Foundation Studies) and SML (School of Modern Languages) on 'Formative language activities using technology'.

The seminar focused on strategies for engaging students with formative and summative feedback using a range of technologies both in and outside the language classroom.

I took away lots of good ideas but also a few questions that remain unanswered. First, are we now more inclined to the idea that best practice may require the use of multiple technologies rather than one solution for all? Second, how can we make the environment seamless for our students? And what about accessibility requirements?

my notes on the event

Engaging students with feedback. 'I know I did not come to our feedback appointment, but could you tell me what my mark is?' Emilie Poletto's first slide, showing a teacher snowed under a huge mountain of paper, is a great illustration of the issue: most of the time students tend to concentrate on the end product rather than on their learning process, but it is up to us to change this, says Emilie: 'we need to change the role of the student from a consumer approach to a partnership'.
So the big question is 'What strategies can we use to rethink the way we give students formative feedback?' It clearly requires more than a shiny new piece of technology. Maggie Boswell says the change must be driven by the learning process, not the means of delivery: 'Some might argue for the use of technology to mark student work while others might argue for traditional methodologies. How students engage with their feedback and make subsequent progress is at the heart of my student-driven ongoing enquiry.'

Here are a few tips that the teachers shared with the audience:

  • work with students on assessment criteria and engage them in collaborative learning activities. Give them the opportunity to identify their strengths and weaknesses and to own a plan for improving their competences.
  • ask students to identify specific features for formative feedback so that you can target both the quality and the amount of feedback you provide
  • use personalised feedback, e.g. video through Mediasite or any screencasting solution
  • use a variety of feedback formats: written, audio and video
  • provide student support throughout the whole process; they may not need help with using the technology so much as with orientation, for example finding where they have to go to look at their feedback

a bit more from some of the individual presentations

Maggie Boswell uses a combination of different feedback formats, such as drop-in corrective written and voice comments, and a range of technologies like Turnitin Grademark and Mediasite. Turnitin Grademark allows her to annotate essays using both a reusable comment bank and voice recording features, while Mediasite desktop recorder allows her to create screencasts and add audio feedback.

With this combination of methods she provides feedback during TB1, over a twelve-week period, on the essay redraft and final draft. A couple of tips from Maggie on voice feedback: first, students engage more with this type of feedback because they hear a familiar voice; second, it is important to use the right tone and elaborate on some of the negative comments so that students don't worry too much about a mistake that may be less serious than they think. 'I really like the video feedback. At first when I saw "omit" (a Grademark drop-in written comment) I thought it was really bad, but when I watched the video, I realised it was not such a bad error because of the intonation.' (from a student survey collected via Google Forms).

Emilie Poletto's presentation, 'Thanks for the feedback, but what is my mark?' How to help students engage with feedback, was the one I liked most as it went straight to the point: we spend lots of time providing formative feedback and then realise that students completely ignore it and only focus on the final mark. What can we do about this?

Emilie's approach, inspired by the work of Alex Forsythe & Sophie Johnson as well as the work of other colleagues in the SML, focuses on 'feedback action plans' and student motivation. Each student gets an individual action plan to record specific areas of their learning, which are routinely discussed with the teacher during individual tutorials. The action plan puts the onus on the students to devise their own strategies, critically evaluate the feedback they are given, build on their strengths and address their weaknesses. Students may not be used to doing all of this at first, but they are more likely to engage if they feel they are in charge of the process and get good support from their teacher. Grades are only discussed at a later stage; in fact, Emilie doesn't give students their marks until they have completed the action plan, which means students really have to focus on their learning first.

In terms of working with multiple technologies, I liked Jana Nahodilová's presentation about the use of Blackboard, Quizlet and Xerte: taking the best parts of all of them to support assessment and feedback. Her approach to providing formative assessment is built on three main areas: an ongoing, multi-phase daily process that takes place through teacher-pupil interaction; providing feedback for immediate action (for student and teacher); and reflecting on how to modify teaching activities to improve learning (motivation) and results.
For each of these tools Jana has identified both advantages and considerations from a teacher's perspective. Advantages include 'easy to use and interactive', 'great for monitoring' and 'wide range of possible activities', while some considerations are 'little flexibility', 'complex set-up' and 'lack of the functionalities required'.

More on the range of technologies on show

Blackboard: assessment engine available within Blackboard and fully supported at UoB

Xerte: online tutorial tool with a range of functionalities for assessment and feedback, fully supported at UoB

Quizlet: free online learning tool, particularly used for flashcards to support vocabulary learning

Mediasite: fully supported UoB lecture capture tool with a range of functionalities for editing videos and screencasting

Turnitin Grademark: grading tool fully supported at UoB, with a variety of functionalities for automated written feedback and voice feedback

Google Forms: free and easy-to-use quiz tool available from individual Google accounts

Sonocent: audio note-taking software with a wide array of functionalities for feedback and assessment, such as visual annotation of text and audio

Many thanks to the presenters for sharing their work:

Maggie Boswell, English teacher (CELFS)
Emilie Poletto, French teacher (SML)
Jana Nahodilová, Czech teacher (SML)