My thoughts on the ‘Future of Assessment and Feedback’ conference – November 2021

In November 2021 I attended the Future of Assessment and Feedback conference, organised by EUNIS, GÉANT, IMS Global and Jisc.

It was a two-day event covering a wide range of topics: effective practice, ways to scale up activities while maintaining standards, and technical developments such as LTI (a technical standard for linking third-party tools to VLEs like Blackboard) and QTI (an interoperability standard for writing multiple-choice questions).

Overall, I thought it was a worthwhile event, featuring international speakers and subject experts, with lots of activities to engage with: demonstrations, panel discussions and an opportunity to chat with the experts ‘in the taverna’, a virtual meet-up space.

I have collated my thoughts in this blog post to share them with colleagues and anyone else interested in assessment and feedback practice. Comments welcome 😊

As someone who has been supporting digital assessments for almost two decades, the opening talk, Good assessment and feedback principles (Gill Farrell, Lisa Gray and Sarah Knight), really resonated with me:

assessment is an area traditionally stubbornly resistant to change, but the change has been forced upon us by the pandemic

I certainly recommend the Jisc assessment and feedback programme as a good place to begin to understand the transformation of digital assessment and to develop local guidance. In fact, some of the research publications, such as ‘Transforming assessment and feedback with technology’, have informed our own guidance on the assessment lifecycle, while the principles devised by the REAP project (Nicol and Macfarlane-Dick, 2006) are still part of our references and core resources. Actually, I don’t think we have a page on our website that doesn’t link to a Jisc publication! This shows the amount of work that has gone into developing assessment and feedback practices over the last twenty years, and the impact that the research has had at local level: in our case, the development of our own University principles for assessment and feedback in taught programmes, implemented in 2015.


Following on from the Jisc publications, Gill’s second talk, about the assessment ecosystem, looked at the EMA (Electronic Management of Assessment) work in more detail. In 2014 Jisc launched the EMA project, a landscape review of digital assessment in the UK, which resulted in a lot of good guidance that we have been using over the years to develop our own workflows and to scale up activities at institutional level. I liked the “Painometer” 2014–2021: it’s a great way to show which areas of EMA staff and students were, and are, most dissatisfied with. The comparison also highlights how changes in requirements and policies have influenced users’ satisfaction. For example, in 2014 accessibility and inclusion were problematic for 5% of respondents, but that figure went up to 50% in 2021. As Gill said, “have we got worse? Or is it that we are now more aware of these issues?” Well, it must be the latter!

Building the assessment ecosystem, Gill Farrell


Ewoud De Kok, CEO and founder of FeedbackFruits (an EdTech company founded in 2012 in the Netherlands), gave a very engaging talk (no PowerPoint slides!) about three main threats to higher education in society at large, and specifically to degree qualifications offered by traditional academic institutions.

  • Traditional colleges and universities have relied for far too long on their brand names as institutions, while more and more companies are assessing people on skills rather than on CVs.
  • More agile and flexible learning experiences, offered by private companies or as part of professional training, are often more relevant and focused than traditional university learning.
  • The amount of attention that students devote to their studies is diminishing.

What can we do about it? One thing is to keep developing the ‘learning experience’, the research on educational science and the effective use of technology, in both blended and online learning.

Moving out of the stone age of learning design, Ewoud De Kok


I couldn’t attend the talk ‘Making large class feel small’ by Danny Liu and Kimberley Baskin, but I’ve listened to the recording and thought I’d include it in this post because SRES looks like a really useful tool – something to add to my horizon-scanning list! The system was developed specifically to engage with students on a personalised level and to help them feel like they are part of a group, not just ‘lost in the crowd’. Its development was underpinned by the idea that feedback is a process, not just a one-way communication, and that it needs to respond to both staff and student needs.

The system helps staff to collate, analyse and visualise data easily, and it generates personalised student reports that staff can send out using a variety of communication tools. From the student perspective, the LMS integration and the personalised reports, which can include information like their preferred name, grades and feedback, have helped to increase engagement with learning activities and satisfaction. I think that having an ‘all in one place’ option would be an advantage to teaching staff, and I’d be interested to explore these functionalities to see whether they could improve on what we currently provide.
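The talk didn’t go into SRES’s internals, but the core idea of a personalised report is essentially a mail merge over per-student data. As a minimal illustrative sketch (the field names and template text below are hypothetical, not SRES’s actual format):

```python
from string import Template

# Mail-merge-style personalised report: one message per student record.
# All field names and the template wording here are hypothetical.
REPORT = Template(
    "Hi $preferred_name, your mark for $assignment was $grade. $feedback"
)

def build_reports(students):
    """Return one personalised message per student record."""
    return [REPORT.substitute(s) for s in students]

students = [
    {"preferred_name": "Sam", "assignment": "Essay 1",
     "grade": "65", "feedback": "Strong argument; cite more sources."},
    {"preferred_name": "Priya", "assignment": "Essay 1",
     "grade": "72", "feedback": "Excellent structure throughout."},
]

for message in build_reports(students):
    print(message)
```

The value of a system like SRES is everything around this step – collecting the data, previewing the reports and sending them through the right channel – but the personalisation itself is this simple substitution.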

Example of uses and info on free licence agreement on the SRES (Student Relationship Engagement System) homepage.

Making large class feel small, Danny Liu & Kimberley Baskin


This talk was very timely for me because I have recently started to curate information and experiences about the Turnitin LTI 1.3 integration for our next piece of development work. I thought Martin did a great job in making his presentation accessible to anyone like me who is not involved in technical architecture (not that I wish to be 😊). Having a high-level overview of what LTIs are and can do was extremely useful. We already use LTI integrations for other tools, and looking at the specs it seems that LTI 1.3 will be an improvement for both staff and student experience. If I were to follow up on this, I’d like to find out more about the customised assessment workflows; given that our own EMA workflows have now been fully adopted, I’d be interested to find out how easily they could be translated.
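To make the high-level overview a little more concrete: an LTI 1.3 launch arrives at the tool as a signed JWT whose payload carries namespaced IMS claims. The sketch below checks a few of the required claims on an already-decoded payload (represented here as a plain dict; real code must first verify the JWT signature against the platform’s public keys, which is elided, and the deployment id value is a made-up example):

```python
# LTI 1.3 claims live under a common IMS namespace prefix.
LTI = "https://purl.imsglobal.org/spec/lti/claim/"

def basic_launch_ok(payload: dict) -> bool:
    """Return True if the decoded payload looks like an LTI 1.3
    resource-link launch (signature verification not shown)."""
    return (
        payload.get(LTI + "message_type") == "LtiResourceLinkRequest"
        and payload.get(LTI + "version") == "1.3.0"
        and bool(payload.get(LTI + "deployment_id"))
    )

example = {
    LTI + "message_type": "LtiResourceLinkRequest",
    LTI + "version": "1.3.0",
    LTI + "deployment_id": "deploy-123",  # hypothetical value
}
print(basic_launch_ok(example))
```

In practice a certified library handles the full OIDC login and key management; the point is just that “an LTI launch” is structured, verifiable data rather than magic.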

Introduction to LTI 1.3 and LTI Advantage, Martin Lenord


These talks looked at the development of the QTI open standard for writing MCQ-type questions, which I have used off and on but haven’t had the chance to keep up with in the last few years. I don’t think we are going to move to systematic use of item banks, which would require a standard like QTI, but we do provide some support for it, so it’s good to know what’s happening.

If I remember correctly, when I started my role in e-assessment support in 2006, most academic staff were interested in using an independent tool to create items, possibly offline, and in sharing them with colleagues (which meant attaching a file to an email!), but then using a delivery system of their choice to run the assessment. For this reason we purchased Respondus 4.0, which I sometimes still use to import and export questions in QTI format. However, Respondus never really took off, and it was superseded by Blackboard and Questionmark.

From a technical point of view, the Introduction to QTI 3.0, presented by Mark Molenaar, was interesting because it showed the evolution of the QTI standard from 2000 (1.2) to 2020 (3.0), and the new range of features it now offers: accessibility (for example, adding a glossary for non-English speakers), better customisation options, support for multimedia and interactive content, and integration with other systems, such as proctoring tools, via LTI.
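For anyone who hasn’t looked at QTI since the 1.x or 2.x days: a QTI item is just structured XML, and QTI 3 moved to HTML-friendly kebab-case element names. As a rough sketch of the shape of a multiple-choice item (heavily simplified – a real item also needs response declarations, scoring and namespaces, which are omitted here):

```python
import xml.etree.ElementTree as ET

def make_mcq(prompt: str, choices: list) -> str:
    """Build a simplified QTI 3-style multiple-choice item as an XML string.
    Response processing and namespaces are deliberately omitted."""
    item = ET.Element("qti-assessment-item", {"identifier": "item-1"})
    body = ET.SubElement(item, "qti-item-body")
    interaction = ET.SubElement(
        body, "qti-choice-interaction",
        {"response-identifier": "RESPONSE", "max-choices": "1"},
    )
    ET.SubElement(interaction, "qti-prompt").text = prompt
    for i, choice in enumerate(choices):
        ET.SubElement(
            interaction, "qti-simple-choice", {"identifier": f"choice-{i}"}
        ).text = choice
    return ET.tostring(item, encoding="unicode")

xml = make_mcq("Which standard links tools to a VLE?", ["LTI", "QTI", "PDF"])
print(xml)
```

Because the format is plain XML, items written in one tool can be imported into any compliant delivery system, which is exactly the interoperability that tools like Respondus traded on.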

The other talk about QTI, Assessing the process of thinking using QTI, showed a systematic implementation of the QTI standard for sharing item banks across different learning platforms, delivery systems and reporting tools. The FLIP project, a collaboration between the official education assessment bodies of four countries, is a good example of how the QTI standard has been used to share knowledge and experience in e-assessment, technology development and digital transformation.

Assessing the process of thinking using QTI, Saskia Kespaik & Franck Selles

Introduction to QTI 3.0,  Mark Molenaar

The full list of recordings and presentations is available on the EUNIS website.

A student voice in the DEO: What have the Student Digital Champions found so far?

Since the 2019 Digital Experience Insight Survey, which revealed so much about students’ experiences of the digital learning environment at Bristol even before the pandemic, the DEO have been keen to channel student voices straight into our work. With 2020 turning out the way that it did, it was even more crucial to make that a reality, and so we worked with Bristol SU to recruit 12 Student Digital Champions (SDCs) from across all faculties in the University. They don’t have a lot of time each week with us, but they’ve definitely been making the most of that time so far!  

You can get to know them a little better by viewing this introduction video, also found on the DEO Student Digital Champion project page. 

What have they been up to? 

Since joining the DEO team, the SDCs have been actively getting out into their faculties, going to course rep meetings and faculty meetings, and talking to staff Digital Champions and other key staff. They’re reporting that even just being in meetings with their ‘digital champion’ hat on has been sparking interesting conversations with course reps and students about the student experience of digital learning in 2020 so far. 

They’ve already worked with us on two new DEO guides, instigated by student feedback in the Pulse surveys: the guides on Interactivity in large sessions and Breakout rooms. They’ve also worked to co-create and give feedback on the Assessment Checklist and Troubleshooting guide, and other areas of the new Digitally Ready online space on assessment, which launched on 5th January to support students during this assessment period.

What have they found?

The remit of the SDCs is to look for patterns emerging in the student experience across faculties and schools, and work together on the key themes of student engagement in learning and community building. They’ve been tasked with getting students to talk about solutions to their problems too: we want to hear ideas for what could be done differently, or what is working really well and how that could be expanded.

So far they’ve noticed…

Some of the common themes which seem to be emerging across the student experience include:

The cohort conundrum  

Students are feeling disconnected, lacking a sense of belonging and of a shared experience. Many report that this is partly due to other students not being active and engaged in online sessions, particularly in not turning on their videos. On the other hand, students also said they feel anxious themselves about being in online sessions, particularly breakout sessions, and about turning on their own mics and video. In the Engineering faculty, students actually felt there was an increase in engagement between students when using the general discussion forum to ask questions. Students seem to be asking more questions and sharing information with each other.

‘I don’t wanna be just a guy on the screen. I want us to be more like a cohort.’  [Year 1 Student, Centre for Innovation] 

Clarity and simplicity make good online course spaces 

Echoing student feedback in previous years, students are now more than ever keen on things being concise, clear, and easy to navigate. Videos around 20-30 minutes seem to be the maximum that students feel they can engage with, with most preferring 10-15 minutes. Our SDCs are also reporting that a messy Blackboard course space can be pretty discouraging, especially to first year students! 

Group work online is brilliant/impossible (delete as appropriate)  

We’re hearing loud and clear that the tools of online learning – shared documents, MS Teams, BB Collaborate and BB journals – are potentially great for making group work easier to manage and coordinate. Students are getting to grips with what these systems can offer, and love the flexibility (when the technology allows – internet connection problems are frequently mentioned too!). But they would like more guidance on these tools and how to use them effectively. At the same time, the lack of group identity, and the fact that they may never have actually met their peers in person, is making things difficult.

‘Only a few people are turning up. How can I trust someone to do their work when we’ve never met?’ [Year 1 Student, Arts]

And they’ve suggested…

There are already several projects in the pipeline, ideas for what might be possible, and pilots in progress. A snapshot of these includes:

A Breakout Room toolkit – A toolkit for staff, made by students, on how to plan and deliver the best breakout room experience. This is broken down by year, recognising that first years have different needs and situations than returning students. It includes ideas for group sizes and permanence (3–5 week rotations for groups seem popular), and for establishing group identity, as well as how to encourage students to actively participate. More on this soon…

‘Online mingle’ pilot – In partnership with the Centre for Innovation, creating a template for how to run ‘speed dating’ type welcome sessions for students, where they can get to know each other and practice speaking online in a safe and fun environment.  

Motivation Panels – Here, more experienced students support first years involved in team/group work, sparking a sense of what their degree is about and helping them feel motivated by the subject. Led by course reps and students, this is a way to feel part of something bigger than your own unit or programme.

Shared spaces – using tools like MS Teams to explore ways for students to meet regularly and informally. This could include news and inspiration, notices of events, a ‘Help me out’ forum, and introductions to different people within their programme or school. 

Groupwork toolkits – Deliverables to help students choose the best tools to use, and how to use them, for group work, as well as how to maximise group work as a way to meet people, and gain the sense of social interaction often missing online.  

School assemblies – Regular school-wide live sessions, to give a sense of belonging and motivation across a school, rather than just within a unit or programme. These are already being run in the School of Psychological Science, and the SDCs are working to find out what it is about them that is so engaging, and how that might be replicated across the university.


Where are we now? – notes from the reading group

For our first reading group since March, and our first ever online, we looked at recent (post-COVID) articles on education. It was a somewhat eclectic selection, but it was very good to be back together!

Moving Into the Long Term by Lilah Burke and A Renewed Focus on the Practice of Teaching by Shigeru Miyagawa and Meghan Perdue (notes by Suzi Wells)

These two short articles reflected on the staff (and student) experience of teaching since March. 

Miyagawa and Perdue interviewed more than 30 faculty members at MIT about their experiences. The themes of their responses seem familiar from our experience in Bristol:

  • Many staff voiced an increased interest in the practice of teaching
  • Teaching has been more challenging and at times more rewarding – the crisis has forced us to come up with creative solutions to problems, which can be exciting
  • COVID has forced us to re-evaluate what is important, being unable to rely on face-to-face where we (think we) already know what works
  • Testing students online is harder and staff are questioning why and how much it is needed

A lot of what was covered in the Burke article is not surprising: students (and academics) feeling more isolated, and struggling with the difference between their expectations and where we now find ourselves. One of the people interviewed raised the point that so much has changed it will be hard to measure whether learning has suffered (or indeed improved). This seemed interesting to me and made me wonder what we can meaningfully measure, and in particular whether we can measure or evaluate what we learn from just dealing with a crisis like this.

How universities can ensure students still have a good experience, despite coronavirus (notes by Chrysanthi Tseloudi)

The article suggests 3 things universities can do to improve students’ experience during coronavirus (and in general).

  1. Listen: Survey students regularly, make changes based on the answers and communicate these to students.
  2. Communicate: via multiple channels (email is not the best for students), explain from a student’s point of view, tailored to different students.
  3. Invest: in hardware, software, networking capacity, staff training to ensure quality, consistency and innovation.

Just in time CPD by Virna Rossi (notes by Michael Marcinkowski)

This piece offered personal reflections on support strategies for helping teaching staff adapt to online teaching in the wake of COVID-19. The author highlighted the use of a staff-wide chat built into the University’s VLE and detailed the trials and tribulations of trying to answer questions posted by staff in video form. Though mostly a personal reflection on the processes, this piece did contain a number of salient details:

  1. The author tried to use video responses to questions in order to evoke a sense of being present with teaching staff. Well-being, for both staff and students, was a prime concern, as evidenced by the questions and the use of support materials related to well-being, though it remains an open question whether or not the use of video in this case had its intended impact. What can be said is that the author found the process of video production time-consuming.
  2. They also consciously used “low tech” elements in their demonstrations of online teaching for staff, in the belief that these would make staff feel more comfortable about making less-than-perfect resources. This included creating hand-drawn slides for use in video presentations.

Overall, the article was an interesting read for the personal detail it provided; however, it had little substantive advice to build on, outside the general claim regarding the importance of support and a concern for staff well-being.

Designing out plagiarism for online assessment (notes by Hannah Gurr)

246 reasons to cheat: outsourcing from essay mills is a way for students to ‘quit’ without losing the qualification they were working towards. Some may turn to this type of cheating due to an inability to handle the academic workload, or an unwillingness to do so.

HE Institutions need to know why plagiarism happens, while students need to come to understand the range of ways in which plagiarism can occur. HEIs need a developmental approach in formative assignments to help students know how to avoid plagiarism. The academic community also needs to place a positive focus on academic integrity (e.g. UoB 6 values of honesty, trust, fairness, respect, responsibility, courage), not just a negative focus on misconduct.

A Different Way to Deliver Student Feedback (lessons from the performing arts for STEM) (notes by Chrysanthi Tseloudi)

Tough-love feedback on open-ended work usually doesn’t work well. Students don’t receive it well and may feel alienated, while instructors often shift the blame to them for not being able to handle critical feedback.

The method described (based in arts, but in this article aimed at STEM) attempts to shift the dynamics and give the student power over the feedback they receive. It features 3 roles and 4 steps:

Roles: the artist (student), the responder (instructor/ student peer/ feedback giver, etc) and the facilitator (neutral party, optional).


  1. Statements of Meaning: Responders provide positive feedback about something they found meaningful, interesting, or exciting in the work.
  2. Artist as Questioner: The student asks questions about their work, focusing on the feedback they need at the moment and responders reply to these questions.
  3. Neutral Questions: Responders ask neutral questions (questions without hidden comments/ opinions) about the work, and the student responds.
  4. Opinion Time: Responders can give any other feedback they want – but only if given permission by the student. Students often don’t feel they can say no, so they will need to be reassured that they can.

Writer’s takeaway: Even if not using this method, it’s useful to ask the student what particular feedback they want at that moment. They may be surprised, as many have never been asked before. It will take them a bit of time to get used to it. But once they feel secure, tough love won’t be needed for their work to improve.

Virtual Learning Should and Can Be Hands-On (focus on labs) by Alexis R. Abramson (notes by Paddy Uglow)

Course leaders at Dartmouth College were able to keep the hands-on learning qualities of their engineering courses in the following ways:

  • $200 mini 3D printers were sent to students
  • Some lab equipment was adapted for remote operation
  • Hardware kits were sent to students containing cheap components that could be used to carry out experiments and demonstrate principles.
  • Students and staff used their imagination and home resources to replace lab-based equipment

The Reading Group discussed the article, and talked about the advantages of these methods and the use of VR video (of experiments and medical procedures). These included:

  • A real sense of “getting your hands dirty” (eg leaking chemicals, mistakes in following procedure, spillages, etc) which can’t be replicated with a computer-based version (it would be interesting to compare student performance between those learning virtually and physically – medical students practice injections on oranges, for example)
  • There’s no queuing for equipment or being unable to see properly when a demonstration is given
  • Lab experiments are often done in groups, and sometimes one person rushes ahead and doesn’t let the rest of their group gain a full understanding of what’s happening. Working at home with a kit, each student has to do it themselves, or at least gain the learning experience of why they’ve been unable to do it.

During the discussion, it emerged that the University of Bristol has been using similar techniques.

National Institute for Digital Learning good reads from 2019 – notes from the reading group

Do MOOCs contribute to student equity and social inclusion? A systematic review by Sarah R Lambert (read by Suzi Wells)

This study is a large literature review looking at both empirical research and practitioner guidance around using MOOCs and other open (as in free) learning to promote student equity and social inclusion. The study starts from 2014 because that’s the second wave of MOOCs, where more stable and sustainable practice begins to emerge.

The aims of the MOOCs were broken into two broad areas: those focused on student equity (making education fairer for tertiary learners and those preparing to enrol in HE), which was the most common aim, and those that sought to improve social inclusion (education of non-award holders and lifelong learners). The literature was predominantly from the US, UK and Australia – possibly unsurprising for a review restricted to articles written in English. There was a near 50/50 split between empirical research and policy/practice recommendations, and the studies focused slightly more on MOOCs than on other open learning. One notable finding was that the success rate (among published studies at least) was high: more often than not, projects met or exceeded their aims.

Lambert includes lots of useful detail about factors that may have led to successful projects. MOOC developers should learn about their learners and make content relevant and relatable to them. Successful projects often had community partners involved in the design, delivery & support – in particular, initiatives with large cohorts (~100) that were very successful all had this. Designing for the learners meant things like: designing for mobile-only and offline access, teaching people in their own language (or at least providing mother-tongue facilitation) and, where relevant, mixing practitioners with academics in the content.

Facilitating and supporting the learning was also key to success. Local study groups or face-to-face workshops were used by some projects to provide localisation and contextualisation. Facilitators would ideally be drawn from existing community networks.

A related point was to design content from scratch – recycling existing HE materials was not as successful. This should be done by an interdisciplinary team and/or community partnership. Being driven entirely by an IT or digital education team was an indicator that a project would not meet its aims: projects need technical expertise, but education and/or widening-participation expertise too. Open as in free-to-use is fine; the licence didn’t seem to have an impact.

In short:

  • Work with the people you intend to benefit.
  • Create, don’t recycle.
  • Don’t expect the materials to stand by themselves.

If you’re interested in social justice through open learning, think OU not OERs.

What does the ‘Postdigital’ mean for education? Three critical perspectives on the digital, with implications for educational research and practice by Jeremy Knox (read by Suzanne Collins)

This article explores what ‘post-digital’ education means, specifically thinking about human–technology relationships. It begins with an analysis of the term ‘post-digital’, embracing the perspective of ‘post’ as a critical appraisal of our understanding of the digital, rather than simply a different stage after the digital. This initial analysis is worth a read, but it is not my main focus for this reading group, so here I’ll jump straight to the main discussion, which is based on three critical perspectives on the digital in education.

The first is “Digital as Capital”. Here, Knox talks about the commercialisation and capitalisation of digital platforms, such as social media. This platform model is increasingly based on the commodification of data, and so inevitably students/teachers/learners become seen as something which can be analysed (eg learning analytics), or something under surveillance. If surveillance is equated to recognition, this leads to further (perhaps troubling?) implications. Do you need to be seen to be counted as a learner? Is learning always visible? Does this move away from the idea of the web and digital being ‘social constructivist’?

Secondly, Knox looks at “Digital as Policy”. This (for me) slightly more familiar ground discusses the idea that ‘digital’ education is no longer as separate or distinct from ‘education’ as it once was. In a ‘post-digital’ understanding, it is in fact mainstream rather than alternative or progressive. The digital in education, however, often manifests as metrification in governance – eg schools are searchable in rankings based on algorithms. In this sense, ‘digital education’ moves away from ‘classroom gadgets’ (as Knox puts it) and sees it as something intrinsic and embedded in policy, with strategic influence.

Lastly, he discusses “Digital as Material”, which focuses on surfacing the hidden material dimensions of a sector often seen as ‘virtual’ and therefore ‘intangible’. The tangible, material aspects of digital education include devices, servers and other physical elements which require manual labour and material resources. On one hand there is efficiency, but on the other there is always labour. As education, particularly digital education, often comes from a sense of social egalitarianism and social justice, this is a troubling realisation, and one which leads to a rethink of the way digital education is positioned in a post-digital perspective.

In conclusion, Knox suggests that ‘post-digital’ should be understood as a ‘holding to account of the many assumptions associated with digital technology’, which I feel sums up his argument and is probably something we should try and do more of regardless of whether we’re a ‘digital’ or ‘post-digital’ education office.

What’s the problem with learning analytics? by Neil Selwyn (read by Michael Marcinkowski)

For this last session I read Neil Selwyn’s ‘What’s the Problem with Learning Analytics?’ from the Journal of Learning Analytics. Appearing in a journal published by the Society for Learning Analytics Research, Selwyn’s socio-technical approach to the analysis of learning analytics was a welcome, if somewhat predictable, take on a field that too often seems to find itself somewhat myopically digging for solutions to its own narrow set of questions.

Setting learning analytics within a larger social, cultural, and economic field of analysis, Selwyn lays out an orderly account of a number of critical concerns, organized around the implications and values present in learning analytics.

Selwyn lists these consequences of learning analytics as areas to be questioned:

  1. A reduced understanding of education: instead of a holistic view of education it is reduced to a simple numerical metric.
  2. Ignoring the broader social contexts of education: there is a danger that by limiting the understanding of education that we ignore important contextual factors affecting education.
  3. Reducing students’ and teachers’ capacity for informed decision-making: the results of learning analytics come to overtake other types of decision-making.
  4. A means of surveillance rather than support: in their use, learning analytics can have more punitive rather than pedagogical implications.
  5. A source of performativity: students and teachers each begin to focus on achieving results that can be measured by analytics rather than other measures of learning.
  6. Disadvantaging a large number of people: like any data driven system, decisions about winners and losers can be unintentionally baked into the system.
  7. Servicing institutional rather than individual interests: the analytics has more direct benefit for educational institutions and analytic providers than it does for students.

He goes on to list several questionable values embedded in learning analytics:

  1. A blind faith in data: There is a risk that there is a contemporary over-valuation of the importance of data.
  2. Concerns over the data economy: What are the implications when student data is monetized by companies?
  3. The limits of free choice and individual agency: Does a reliance on analytic data remove the ability of students and educators to have a say in their education?
  4. An implicit techno-idealism: part of learning analytics is a belief in the benefits of the impacts of technology.

Toward this, Selwyn proposes a few broad areas for change designed to improve learning analytics’ standing within a wider field of concern:

  1. Rethink the design of learning analytics: allow for more transparency and customization for students.
  2. Rethink the economics of learning analytics: give students ownership of their data.
  3. Rethink the governance of learning analytics: establish regulatory oversight for student data.
  4. A better public understanding of learning analytics: educate the wider public of the ethical implications of the application of learning analytics to student data.

Overall, Selwyn’s main point remains the most valuable: the idea of learning analytics should be examined within the full constellation of social and cultural structures within which it is embedded. Like any form of data analytics, learning analytics does not exist as a perfect guide to any action, and the insights it generates need to be understood as only partial, shaped by the mechanisms designed to generate the data. In the end, Selwyn’s account is a helpful one — it is useful to have such critical voices welcomed into SoLAR — but the level at which he casts his recommendations remains too broad for anything other than a starting point. Setting clear policy goals and fostering a broad understanding of learning analytics are hopeful external changes that can be made to the context within which learning analytics is used, but in the end, what is necessary is for those working in the field of learning analytics, who are constructing systems of data generation and analysis, to alter the approaches they take, both in the ‘ownership’ and in the interpretation of student data. This reinforces the need to change how we understand what ‘data’ is and how we think about using it. Following Selwyn, the most important change might be to re-evaluate the ontological constitution of data and our connection to it, coming to understand it not as something distinct from students’ education, but as an integral part of it.

Valuing technology-enhanced academic conferences for continuing professional development: a systematic literature review. Professional Development in Education, by Maria Spilker (read by Naomi Beckett)

This literature review analyses the different approaches taken to enhance academic conferences technologically for continuing professional development. Although there have been advances and new practices are emerging, a coherent approach is still lacking: conferences tend to be evaluated in narrow ways that do not consider all perspectives.

‘Continued professional development for academics is critical in times of increased speed and innovation, and this intensifies the responsibilities of academics.’ 

This makes it all the more important to ensure that when academics come together at a conference, there is a systematic approach to what they should be getting out of the time spent there. The paper suggests this needs to be looked at when first starting to plan a conference: what are the values?

The paper talks about developing different learning experiences at a conference to engage staff and build their professional development. There is often little time for reflection, and the paper suggests looking at more ways to include it, for example through technology. Engagement on Twitter gives delegates another channel to discuss and network, taking them away from traditional conference formats. Moving more conferences online also gives delegates the opportunity to reach wider networks.

The paper applies a value creation framework, looking at what value we should be taking out of conferences: immediate value, potential value, applied value, realised value and re-framing value. Considering these from the start is a good way to begin thinking about how we frame academic conferences, so that delegates get the most out of the time spent there and work on their own professional development too.

We asked teenagers what adults are missing about technology. This was the best response by Taylor Fang (read by Paddy Uglow)

Some thoughts I took away:

  • Traditionally a “screen” was a thing to hide or protect, and to aid privacy. Now it’s very much the opposite.
  • Has society changed so much that social media is the only place that young people can express themselves and build a picture of who they are and what their place is in the world?
  • Adults have a duty to help young people find other ways to show who they are to the world (and to see the subsequent reflection back to themselves)
  • Digital = data = monetisation: everything young people do online goes into a money-making system which doesn’t have their benefit as its primary goal.
  • Young people are growing up in a world where their importance and value are quantified by stats, likes, shares and so on; all these numbers demonstrate to them that they’re less important than other people, encouraging desperate measures to improve their metrics.
  • Does a meal/holiday/party/etc really exist unless it’s been published and Liked?
  • Does the same apply to Learning Analytics? Are some of the most useful learning experiences those which don’t have a definition or a grade?

Digital Accessibility and Neurodiversity

Dafydd presenting to a lecture theatre

Last week we hosted the third of our Digital Accessibility events, this time with Dafydd Henke-Reed, Senior Accessibility Consultant with AbilityNet. Dafydd has been diagnosed with Autism and Dyslexia and spoke about his personal experiences of Neurodiversity.

Dafydd was engaging and open about his experiences growing up, going to University and the technology he uses day to day. From the very start he highlighted that Autism is a spectrum and that we were hearing what Neurodiversity means to him.

From Cognitive Brick Walls to being horrified when friendly lecturers asked him to move forward from the back row of a lecture theatre, we heard about the barriers and obstacles he had faced.

What stood out for me


“Dyslexia could be solved with tools; Autism was about learning how to thrive in a seemingly hostile culture.”

Dafydd had refused support related to Autism at University. Tactics such as large yellow “appropriate allowance when marking” stickers felt like a brand. This is pertinent: many students may not disclose their “disabilities” due to previous experience, or because they find allowances intrusive or counterproductive. In fact, people with conditions such as Autism Spectrum Disorder may not consider it a disability in the first place; it’s just the way they are. If we are to be truly inclusive, then we need to design our learning experiences to remove barriers, so that everyone benefits.

“Come over for group study and we’ll get beers and Pizza in? Hell no!”

Dafydd spoke about how he finds groups and teamwork challenging. He’ll use digital tools like Slack or instant messaging to communicate rather than walking to a colleague’s desk. He also praised electronic tickets (“I won’t lose them”).

He showed us the Speech to Text (STT) and Text to Speech (TTS) systems he uses every day along with the spelling correction functionality.

“Not good enough”

Dafydd will obsess about making things perfect. From essays with over 20 drafts to repeatedly painting his bathroom wall until a relative intervened to say it was fine, he needs regular feedback to help him get past the compulsion to keep improving something.

Do’s and Don’ts


The excellent UK Gov “Do’s and Don’ts” guides were given a name check again, this time for Dyslexia and Autism. If you haven’t seen them, check out these lovely visual guide posters. I think they should be printed out in every office!

Government Designing for disability guides

We have one more session with AbilityNet left, on 5 February, looking at Physical Impairment – there are still a handful of tickets left.

Curriculum design – notes from the reading group

Exploring curriculum design approaches (report by Suzanne Collins)

Suzanne talked about her work exploring curriculum design approaches (separate blog post), where she looks at methodologies such as ABC, Carpe Diem, CAIeRO and ELDeR.

ABC Learning Design (notes by Suzi Wells)

ABC learning design is a rapid design/review methodology developed by Clive Young and Nataša Perović in 2014, drawing on Laurillard’s ‘Conversational Framework’.

The method is centred around a 90-minute workshop, during which participants:

  • Describe their unit in a tweet
  • Map out the types of activity, currently undertaken or planned, against Laurillard’s six learning types
  • Storyboard the unit using the ABC cards

In the DEO a few of us – Roger Gardner, Suzanne Collins, and I – have trialled this approach. Initially this was with a small number of academics interested in redesigning their credit-bearing units. We made much fuller use of it when supporting the design of the FutureLearn courses, following which Suzanne and I presented on this at a UCL conference in 2018: our presentation on using ABC at Bristol.

One advantage of the methodology is that you can run a single 90-minute workshop looking at an entire programme, allowing potential links between the units to become apparent. The short length of the workshop gives at least some chance of getting everyone from the unit together in one place.

The cards are Creative Commons licensed and have been widely adapted, with users adding activities and terminology more relevant to their contexts. On the ABC site you can download ABC cards for MOOCs designed for FutureLearn and EdX. At the conference we heard how people have used stickers in the storyboarding stage to surface things they are interested in: employability skills, alignment with the education strategy, and university-wide themes (such as Bristol’s global citizenship, innovation & enterprise, and sustainable futures themes).

Obviously a 90-minute workshop is not going to give you time to finalise many details, but ABC is quick to learn, very adaptable, and sparks good conversations. It’s remarkable how much can be done in a short time.

Beyond podcasting: creative approaches to designing educational audio (notes by Chrysanthi Tseloudi)

This paper talks about a pilot that aimed to encourage academics to use podcasts in their teaching through a tool in their VLE. The pilot included initial workshops and various types of support for the 25 participants who decided to try it out. All participants used audio, apart from one team, which used video podcasts. Nine of them shared their experience for this paper. They had produced various types of resources: videos about clinical techniques (nursing), audio based on research projects which also received audio feedback from the academic (sport), “questions you’re afraid to ask” (art & design), answers to distance learning students’ questions to reduce the sense of isolation in the VLE (communications), etc.

Academics enjoyed using audio for learner-centred pedagogies, but they also encountered some barriers. Expectations of high quality can be a barrier for both staff and students, while assessing student work in this format is time consuming. Unfamiliarity with the technology can be frustrating and can hold staff back, as they would rather not ask students to do something they don’t feel confident they could do well themselves. Students are not necessarily more confident than staff in using this technology. Following the pilot, the institution’s capacity to support such activities was evaluated and some solutions to support staff were devised.

This was a nice paper with a variety of ideas on using audio for teaching. I found the point about voice on the VLE increasing connectivity and reducing isolation particularly interesting, and would love to see any relevant research on this.

Suggested reading

Approaches to curriculum design

Exploring Curriculum Design Approaches

Recently at the University of Bristol, we’ve all been thinking a lot about learning design, developing curriculum and ways of assessment. BILT’s focus on TESTA for transforming assessment is one way you can see this in action. In higher education, learning design can quickly get complicated – for example, when redesigning a whole programme – and increasingly involves the new and exciting: online or blended aspects, new assessment methods, innovative pedagogies. A method of working when approaching curriculum, programme or learning design can speed up the process and make it much more enjoyable for everyone involved. Helpfully, there are several working methods based on storyboarding which provide a way to navigate this process, and which focus on a team approach to designing learning.

The Digital Education Office have mainly used an approach based on UCL’s ABC: you can read more about our use of this method in a series of blog posts by Suzi Wells and me on a previous ABC conference held at UCL.

Such curriculum design approaches all facilitate discussion and evaluation of current and future learning designs by bringing together relevant stakeholders, learning design specialists and support staff. In the Sway presentation embedded here, we’ll have a quick look at a few, in order to get a taste of what these approaches involve, and how they’ve been used by others. Follow this link to open the Sway in a new tab or window.

Accessibility and mental health

The second in our series of talks from AbilityNet was from Adam, Service Development Manager, on accessibility and mental health. Adam spoke both from his professional and personal experience, and from his in-depth knowledge of technology. The session was fascinating and very useful. Some highlights for me included…

Helping to understand the complexity of the issues

Adam talked about training, as a runner, as a way of understanding what physical pain you can push through and what you can’t – and the idea that the same holds for mental stresses and strains. Some you can push through and some you can’t, and they will be different for different people. He also talked about the idea that there is an increase in perfectionism in younger generations, whether self-directed, driven by social pressure, or outward-facing (expecting more of others).

Do’s and don’ts

The accessibility posters produced by GOV.UK are a great set of resources, and people have been adding their own. Adam showed the posters on designing for users with anxiety, which would be a really useful checklist for a number of our services.

Technology tips

There were lots of great recommendations of apps and tools. The ones that stood out for me were:

  • Word can now check for clarity, conciseness and inclusiveness (for example, unnecessarily gendered language)
  • Presenter Coach, which comes free with PowerPoint online and allows you to rehearse your presentation to an AI audience, could be useful both for improving your own clarity and for giving students a non-threatening way to rehearse
  • Text-to-speech tools are great for proofreading (this is a revelation to me) and also for getting an unemotional reading of emails that have been sent to you
  • Forest app rewards you with your own virtual woodland for spending time away from your phone
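Word’s editor performs checks like these natively, but the underlying idea is simple enough to sketch. Below is a toy illustration in Python of flagging unnecessarily gendered language – the word list and suggested replacements are my own hypothetical examples, not the rules Word actually applies:

```python
import re

# Hypothetical mapping of unnecessarily gendered terms to neutral
# alternatives. Illustrative only - not Word's actual rules.
SUGGESTIONS = {
    "chairman": "chair",
    "manpower": "workforce",
    "mankind": "humankind",
}

def flag_gendered_terms(text):
    """Return (term, suggested replacement) pairs found in the text."""
    findings = []
    for term, neutral in SUGGESTIONS.items():
        # \b ensures whole-word matches, so "chair" alone is not flagged
        if re.search(rf"\b{term}\b", text, re.IGNORECASE):
            findings.append((term, neutral))
    return findings

print(flag_gendered_terms("The chairman asked about manpower."))
# [('chairman', 'chair'), ('manpower', 'workforce')]
```

Real tools also handle inflections, context and tone, but even this sketch shows why such checks are cheap enough to run as you type.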

Finally, as a back-to-paper fan, I love the idea of Google’s printable phone.

What next?

This was another great session with AbilityNet; the two remaining sessions are:

We’ll be releasing some ‘Top Tips’ videos for each strand after the event. We’ll also try to make recordings of the sessions available.

If you would like to talk to the Digital Education Office team about Digital Accessibility, Blackboard Ally or just have related questions do feel free to contact us via:

Tel: +44 (0)117 42 83055 / internal: 83055

Notes from the reading group – free choice

Digital wellbeing toolkit from BBC R&D (notes from Suzi Wells)

This toolkit from BBC R&D is designed for addressing wellbeing in digital product development. Given the increasing focus on student wellbeing, I was interested in whether it could be useful for online and blended learning.

The key resources provided are a set of values cards along with some exploration of what these mean, particularly for young people (16-34 year olds). These were developed by the BBC through desk research, focus groups and surveys.

The toolkit – the flashcards especially – could certainly be used to sense-check whether very digital teaching methods are really including and supporting the kinds of things we might take for granted in face-to-face education. It could also be useful for working with students to discuss the kinds of things that are important to them in our context, and identifying the kinds of things that really aren’t.

Some of the values seem too obvious (“being safe and well”, “receiving recognition”) or baked in to education (“achieving goals”, “growing myself”), and I worry that could be off-putting to some audiences. The language also seemed a little strange – “human values”, as though “humans” were an alien species. It can feel like the target audience for the more descriptive parts is someone who has never met a “human”, much less a young one. Nonetheless, the flashcards in particular could be a useful way to kick off discussions.

Three example flash cards showing the values: being inspired, expressing myself, having autonomy

New studies show the cost of student laptop use in lecture classes (notes from Michael Marcinkowski)

The article I read for this session highlighted two new-ish articles which studied the impact that student laptop use had on learning during lectures. In the roughly ten years that laptops have been a common sight in lecture halls, a number of studies have looked at what impact they have on notetaking during class. These previous studies have frequently found a negative association with laptop use for notetaking in lectures, not only for the student using the laptop, but also for other students sitting nearby, distracted by the laptop screen.

The article took a look at two new studies that attempted to tackle some of the limitations of previous work, particularly addressing the correlative nature of previous findings: perhaps low performing students prefer to use laptops for notetaking so that they can do something else during lectures.

What bears mentioning is that there is something somewhat quaint about studying student laptop use. In most cases it seems a foregone conclusion, and there is no putting that genie back in the bottle. Students will use laptops and other digital technologies in class – there’s almost no other option at this point. Nevertheless, the studies proved interesting.

The first of the highlighted studies featured an experimental set up, randomly assigning students in different sections of an economics class to different conditions: notetaking with laptop, without laptop, or with tablet laying flat on the desk. The last condition was designed to test the effect of students’ being distracted by seeing other students’ screens; the supposition being that if the tablet was laid flat on a desk, it wouldn’t be visible to other students. The students’ performance was then measured based on a final exam taken by students across the three conditions.

After controlling for demographics, GPA, and ACT entrance exam scores, the researchers found that performance was lower for students using digital technologies for notetaking. However, while performance was lower on the multiple choice and short answer sections of the exam, performance on the essay portion was the same across all three conditions.

While the study did address some shortcomings of previous studies (particularly with its randomized experimental design), it also introduced several others. Importantly, it raised questions about how teachers might teach differently when faced with a class of laptop users, or what effect forcing a student who isn’t comfortable using a laptop might have on their performance. Also, given that multiple sections of an economics class were the subject of the study, what role does the discipline being lectured on play in the impact of laptop use?

The second study attempted to address those through a novel design which linked students’ propensity to use or not use laptops in optional-use classes to whether they were required to use, or forbidden from using, laptops in another class on the same day. Researchers looked at institution-wide student performance at an institution with a mix of classes that required, forbade, or had no rules about laptop use.

By looking at student performance in classes in which laptop use was optional, but by linking that performance to whether students would be influenced in their laptop choices based on other classes held the same day, researchers wanted to be able to measure student performance when they had a chance not to use a laptop in class. That is, the design allowed researchers to understand in general how many students might be using a laptop in a laptop-optional class, but still allowing individual students to make a choice based on preference.

What they found was that student performance worsened in optional-use classes that shared a day with laptop-mandated classes, and improved in those that shared a day with laptop-prohibited classes. This is in line with previous studies, but interestingly, the negative effects were felt more strongly by weaker students and in quantitative classes.

In the end, even while these two new studies reinforce what had been previously demonstrated about student laptop use, is there anything that can be done to counter what seem to be the negative effects of laptop use for notetaking? More than anything, what seems to be needed are studies looking at how to boost student performance when using technology in the classroom.

StudyGotchi: Tamagotchi-Like Game-Mechanics to Motivate Students During a Programming Course & To Gamify or Not to Gamify: Towards Developing Design Guidelines for Mobile Language Learning Applications to Support User Experience; two poster/demo papers in the EC-TEL 2019 Proceedings (notes from Chrysanthi Tseloudi)

Both papers talk about the authors’ findings on gamified applications related to learning.

The first regards the app StudyGotchi, based on the Tamagotchi (a virtual pet the user takes care of), which aims to encourage first-year Java programming students to complete tasks on their Moodle platform in order to keep a virtual teacher happy. Less than half the students downloaded the app, with half of those receiving the control version without the game functions (180) and half receiving the game version (194). According to the survey, some of those who didn’t download it reported that this was because they weren’t interested in it. Of those who replied about whether they used it, less than half said they did. According to data collected from students who used either version of the app, there was no difference in either the online behaviour or the exam grades between the groups using the game and non-game versions. The authors attribute this to the lack of interaction, personalisation options and immediate feedback on the students’ actions in Moodle. I also wonder whether a virtual teacher to be made happy is the best choice of “pet”, when hopefully there is already a real teacher supporting students’ learning. Maybe a virtual brain with regions that light up when a quiz is completed, or any non-realistic representation connected to the students’ own development, would be more helpful in increasing students’ intrinsic motivation, since ideally they would be learning for themselves, and not to make someone else happy.

The second paper compares two language learning apps, one of which is gamified. The non-gamified app (LearnIT ASAP) includes exercises where students fill in missing words, feedback showing whether the answer is correct or incorrect, and statistics to track progress. The gamified app (Starfighter) includes exercises where students steer through an asteroid field by selecting answers to given exercises, and a leaderboard to track progress and compete with peers. The evaluation involved interviewing 11 individuals aged 20-50. The authors found that younger and older participants had different views about the types of interaction and the aesthetics of the two apps. Younger participants would have preferred swiping to tapping; older participants seemed to find the non-gamified app comfortable because it looked like a webpage, but were not so sure about the gamified app. The game mechanics of Starfighter were thought to be more engaging, while the pedagogical approach of LearnIT ASAP was thought to be better in terms of instructional value and effectiveness. While the authors state that the main difference between the apps is gamification, given the finding that the pedagogical approach of one of the apps is better, I wonder if that is actually the case. Which game elements actually improve engagement, and how, is still being researched, so I would really like to see comparisons of language learning apps where the presence or absence of game elements is indeed the only difference. Using different pedagogical approaches between the apps is less likely to shed light on the best game elements to use, but it does emphasise the difficulty of creating an application that is both educationally valuable and fun at the same time.

Digital Accessibility and Sight Impairment

Adi Latif from AbilityNet presenting on Sight Impairment

Last week we hosted the first of four Digital Accessibility sessions with AbilityNet, the UK charity supporting people with impairments or disabilities to use digital technology.

The first session focused on Sight Impairment and was presented by Adi Latif, an accessibility consultant with AbilityNet. Adi lost his sight in his teens as a result of a degenerative eye condition. He gave us a potted history of his experiences going into Higher Education and how technology has evolved and helped his journey since. From clunky, archaic-looking speaking watches to the Uber app, he painted a picture of the difficulties he had historically faced accessing the digital world and how he now uses tools to navigate both real life and online spaces.

It was humbling to see him demonstrate how he uses assistive technology such as screen readers and mobile phone apps to make sense of the world. The fact that he was continuously slowing down the audio playback of these tools so we could understand the feedback really struck me. My experience of using screen reading tools to check over digital content I’ve created has been painful at best, with content read back at real-time speed. Adi appeared to be listening at double speed, if not faster.

Adi demonstrated some of the regular pitfalls sight-impaired users come across when accessing documents and covered some best practice to improve accessibility. He also covered just how awful an experience using a PDF file can be, essentially saying “Actually, if I hear a hyperlink say it’s a PDF I probably won’t open it”. After years of advising people to include a PDF, it was slightly horrifying to learn that they are so often inaccessible and that an alternative version should be included.

That makes sense when you consider that PDF is a format created for print, so is primarily concerned with creating an exact copy of the source material. I’ll certainly amend my practice on this front.

Adi discussed various ways to improve accessibility, from the Microsoft Accessibility Checker tool to Blackboard Ally.
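Tools like these automate checks that are conceptually simple. As a rough sketch (not how the Microsoft checker or Ally actually work), one of the most basic checks – flagging images that lack the alt text a screen reader depends on – can be written in a few lines of Python using only the standard library; the class name and sample HTML are hypothetical:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of any <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)  # attrs arrives as (name, value) pairs
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<p><img src="chart.png"><img src="logo.png" alt="University logo"></p>')
print(checker.missing)
# ['chart.png']
```

A real checker would also catch empty or unhelpful alt text, untagged PDFs and poor colour contrast, which is where the dedicated tools earn their keep.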

This was a great first session with AbilityNet; we have three more to go, focusing on:
