The Impact of AI on Learning Design – notes from the Reading Group

by Maxine Sims

This week’s topic was using AI, starting with the Digital Education Institute’s ebook The Impact of AI on Learning Design. The group also discussed other AI tools they have experimented with, and shared thoughts and opinions on where AI could be useful or risky in education.

Some key takeaways:

  • AI can be really useful for inspiration and ‘tidying up’ your thoughts and learning content plans.
  • Tools are emerging and changing all the time. They can produce content at a faster and cheaper rate than traditional methods, particularly video production tools.
  • There are great opportunities to make learning more accessible by supporting the step from thought to writing for groups who find this challenging.
  • There is a fear around the uncertainty of its uses, and some scepticism of its value – particularly when it comes to the suggestion that some traditional jobs may become redundant!
  • Like all new technology it is disruptive and calls into question the value and purpose of education, knowledge and skills. What can we gain and what do we lose are questions that are still debated.
  • Some universities, such as Sheffield and Oxford, have made progress in creating AI working groups to produce T&L guidance, training and support for staff.
  • There are potential uses for AI to free up staff time (a common complaint) to focus on more ‘meaningful’ work, if we can find a comfortable way forward on what data we share and which tasks can be handled by the tech.

Innovating Pedagogy 2024 – notes from the reading group

This week we revived our long-running reading group, partly inspired by London’s Digital Education Reading Group. Our first topic was Innovating Pedagogy 2024 from the Open University. We each read and gave a brief summary of one section, with a little time for questions and discussion. Plenty of food for thought – a really enjoyable and useful conversation. Some recurring themes:

  • Potentials of generative AI
  • Unknowns, risks and costs of AI
  • Inclusion and personalisation
  • Exclusion caused by lack of access to technology or the skills to engage successfully
  • Creating emotionally safe practice spaces – chatbots for rubber duck debugging

We’re running it online to make it easier for people to join. Though this worked well, it definitely makes free-flowing discussion slower and more difficult. Looking forward to future meetings and seeing how the format evolves over time.

My thoughts on the ‘Future of Assessment and Feedback’ conference – November 2021

In November 2021 I attended the Future of Assessment and Feedback conference organised by EUNIS, GÉANT, IMS Global and Jisc.

It was a two-day event covering a wide range of topics: effective practice, ways to scale up activities while maintaining standards, and technical developments such as LTI (a technical standard for linking third-party tools to VLEs like Blackboard) and QTI (an interoperability standard used to write multiple choice questions).

Overall, I thought it was a worthwhile event featuring international speakers and subject experts, with lots of activities to engage with: demonstrations, panel discussions and an opportunity to chat with the experts ‘in the taverna’, a virtual meet-up space created on Gather.town.

I have collated my thoughts in this blog post to share them with colleagues, or anyone else interested in assessment and feedback practice. Comments welcome 😊

As someone who has been supporting digital assessments for almost two decades, the opening talk by Gill Farrell, Good assessment and feedback principles (Gill Farrell, Lisa Gray and Sarah Knight), really resonated with me:

assessment is an area traditionally stubbornly resistant to change, but the change has been forced upon us by the Pandemic

I certainly recommend the Jisc assessment and feedback programme as a good place to begin to understand the transformation of digital assessment and to develop local guidance. In fact, some of the research publications, such as ‘Transforming Assessment and feedback with technology’, have informed our own guidance on the assessment lifecycle, while the principles devised by the REAP project (Nicol and Macfarlane-Dick, 2006) are still part of our references and core resources. Indeed, I don’t think we have a page on our website that doesn’t link to a Jisc publication! This shows the amount of work that has gone into developing assessment and feedback practices over the last twenty years, and the impact that the research has had at local level – in our case, the development of our own University principles for assessment and feedback in taught programmes, implemented in 2015.

BUILDING THE ASSESSMENT ECOSYSTEM

Following on from the Jisc publications, Gill’s second talk, about the assessment ecosystem, looked at the EMA (Electronic Management of Assessment) work in more detail. In 2014 Jisc launched the EMA project, a landscape review of digital assessment in the UK, which resulted in a lot of good guidance that we have been using over the years to develop our own workflows and to scale up activities at institutional level. I liked the “Painometer” 2014–2021: it’s a great way to show which areas of EMA staff and students were, and are, most dissatisfied with. The comparison also highlights how changes in requirements and policies have influenced users’ satisfaction. For example, in 2014 accessibility and inclusion were problematic for 5% of respondents, but this went up to 50% in 2021. As Gill said, “Have we got worse? Or is it that we are now more aware of these issues? Well, it must be the latter!”

Building the assessment ecosystem, Gill Farrell

MOVING OUT OF THE STONE AGE OF LEARNING DESIGN (Keynote speaker)

Ewoud De Kok, CEO and founder of FeedbackFruits (an EdTech company founded in 2012 in the Netherlands), gave a very engaging talk (no PowerPoint slides!) about three main threats to higher education in large societies, and specifically to degree qualifications offered by traditional academic institutions.

  • Traditional colleges and universities have relied for far too long on their brand names as institutions, while more and more companies are assessing people on skills rather than on CVs.
  • There are more agile and flexible learning experiences offered by private companies or as part of professional training, which are more relevant and focused than traditional university learning.
  • The amount of attention that students devote to their studies is diminishing.

What can we do about it? One thing is to keep developing the ‘learning experience’, the research on educational science and the effective use of technology, both in blended and online learning.

Moving out of the stone age of learning design, Ewoud De Kok

MAKING LARGE CLASS FEEL SMALL

I couldn’t attend the talk ‘Making large class feel small’ by Danny Liu and Kimberley Baskin, but I’ve listened to the recording and thought I’d include it in this post because the SRES looks like a really useful tool – something to add to my horizon-scanning list! The system was developed specifically to engage with students on a personalised level and help them feel like part of a group rather than just ‘lost in the crowd’. Its development was underpinned by the idea that feedback is a process, not just one-way communication, and that it needs to respond to both staff and student needs.

The system helps staff to collate, analyse and visualise data easily, and it generates personalised student reports that staff can send out using a variety of communication tools. From the student perspective, the LMS integration and the personalised reports – which can include information like their preferred names, grades and feedback – have helped to increase engagement with learning activities and satisfaction. I think that having an ‘all in one place’ option would be an advantage for teaching staff, and I’d be interested to explore these functionalities to see whether they could improve on what we currently provide.
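To make the ‘all in one place’ idea a little more concrete, here is a toy sketch (in Python with pandas – my own illustration, not the SRES software or its API) of the kind of mail-merge step such a system automates: pulling each student’s preferred name, grade and feedback from a data export and turning them into a personalised message. The column names, template and addresses are all made up.

```python
# Toy illustration of the personalised-report idea behind tools like SRES
# (not the SRES API itself): merge per-student data into a message template.
import pandas as pd

# Hypothetical export of student data - column names and values are placeholders
students = pd.DataFrame([
    {"preferred_name": "Aisha", "email": "aisha@example.ac.uk", "quiz_score": 78,
     "feedback": "Strong on methods; revisit the section on sampling."},
    {"preferred_name": "Ben", "email": "ben@example.ac.uk", "quiz_score": 52,
     "feedback": "Good start - the worked examples in week 3 should help."},
])

TEMPLATE = (
    "Hi {preferred_name},\n\n"
    "You scored {quiz_score}% on this week's quiz. {feedback}\n\n"
    "Drop into the Friday session if you'd like to talk anything through."
)

# Generate one personalised report per student; a real system would send these
# out via email, the VLE, or another communication tool of choice.
for _, row in students.iterrows():
    message = TEMPLATE.format(**row.to_dict())
    print(f"To: {row['email']}\n{message}\n" + "-" * 40)
```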

Examples of uses and information on the free licence agreement can be found on the SRES (Student Relationship Engagement System) homepage.

Making large class feel small, Danny Liu & Kimberley Baskin

INTRODUCTION TO LTI 1.3 AND LTI ADVANTAGE

This talk was very timely for me because I have recently started to curate information and experiences about the Turnitin LTI 1.3 for our next development work. I thought Martin did a great job in making his presentation accessible to anyone like me who is not involved in technical architecture (not that I wish to be 😊). Having a high-level overview of what LTIs are and can do was extremely useful. We are already using LTI integrations for other tools, and from the specs it seems that LTI 1.3 will be an improvement for both the staff and student experience. If I were to follow up on this, I’d like to find out more about the customised assessment workflows; given that our own EMA workflows have now been fully adopted, I’d be interested to find out how easily they could be translated.
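For anyone curious about the plumbing behind an LTI 1.3 launch, here is a minimal sketch of the token check a tool makes when a platform launches it. It assumes Python and the PyJWT library, and the JWKS URL and client ID are placeholders; a real integration would use a dedicated LTI library and handle the full OIDC login flow, so treat this as an illustration of the idea rather than a recipe.

```python
# Minimal sketch of validating an LTI 1.3 launch token (illustrative only).
# Assumes the PyJWT library; the URL and client ID below are placeholders.
import jwt
from jwt import PyJWKClient

PLATFORM_JWKS_URL = "https://vle.example.ac.uk/.well-known/jwks.json"  # placeholder
CLIENT_ID = "my-tool-client-id"  # issued by the platform (placeholder)

def validate_launch(id_token: str) -> dict:
    """Verify the platform's signature and return the LTI launch claims."""
    # Fetch the platform's public key matching the token's key ID
    signing_key = PyJWKClient(PLATFORM_JWKS_URL).get_signing_key_from_jwt(id_token)

    # Check the signature, audience and expiry in one call
    claims = jwt.decode(
        id_token,
        signing_key.key,
        algorithms=["RS256"],
        audience=CLIENT_ID,
    )

    # Core LTI 1.3 claims live under long, namespaced keys
    message_type = claims["https://purl.imsglobal.org/spec/lti/claim/message_type"]
    if message_type != "LtiResourceLinkRequest":
        raise ValueError(f"Unexpected LTI message type: {message_type}")

    return claims
```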

Introduction to LTI 1.3 and LTI Advantage, Martin Lenord

WHAT’S NEW IN QTI 3.0

These talks looked at the development of QTI, the open standard for writing MCQ-type questions, which I have used off and on but haven’t had the chance to keep up with in the last few years. I don’t think we are going to move to a systematic use of item banks, which would require a standard like QTI, but we do provide some support for it, so it’s good to know what’s happening.

If I remember correctly, when I started my role in e-assessment support in 2006 most academic staff were interested in using an independent tool to create items, possibly offline, and in sharing them with colleagues (that meant attaching a file to an email!), but then using a delivery system of their choice to run the assessment. For this reason, we purchased Respondus 4.0, which I sometimes still use to import/export questions in QTI format. However, Respondus never really took off, and it was superseded by Blackboard and Questionmark.

From a technical point of view, the Introduction to QTI 3.0, presented by Mark Molenaar, was interesting because it showed the evolution of the QTI standard from 2000 (1.2) to 2020 (3.0), and the new range of features it now offers: accessibility (for example, adding a glossary for non-English speakers), better customisation options, support for multimedia and interactive content, as well as integration with other systems using LTI, such as proctoring tools.
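To give a flavour of what sits underneath all this, here is a small sketch (using Python’s standard ElementTree, my choice for illustration) that builds a single multiple choice item in the QTI 3.0 style. The kebab-case element names follow the convention I understand QTI 3.0 to use, but they are indicative rather than schema-validated, so this is a picture of the idea, not an item you could import as-is.

```python
# Illustrative sketch of a multiple choice item in a QTI 3.0-like XML style.
# Element and attribute names are indicative, not validated against the schema.
import xml.etree.ElementTree as ET

item = ET.Element("qti-assessment-item", identifier="capitals-q1", title="Capital cities")

# Declare which choice counts as the correct response
response = ET.SubElement(item, "qti-response-declaration", identifier="RESPONSE",
                         cardinality="single", **{"base-type": "identifier"})
correct = ET.SubElement(response, "qti-correct-response")
ET.SubElement(correct, "qti-value").text = "B"

# The question body: a prompt plus three simple choices
body = ET.SubElement(item, "qti-item-body")
interaction = ET.SubElement(body, "qti-choice-interaction",
                            **{"response-identifier": "RESPONSE", "max-choices": "1"})
ET.SubElement(interaction, "qti-prompt").text = "What is the capital of France?"
for ident, text in [("A", "Lyon"), ("B", "Paris"), ("C", "Marseille")]:
    ET.SubElement(interaction, "qti-simple-choice", identifier=ident).text = text

print(ET.tostring(item, encoding="unicode"))
```

In practice a tool like Respondus or the VLE’s own question editor generates this kind of XML for you; the point of the standard is that the same item can be exported from one system and imported into another.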

The other talk about QTI, Assessing the process of thinking using QTI, showed the systematic implementation of the QTI standard for sharing item banks with different learning platforms, delivery systems, or reporting tools. The FLIP project, a collaboration between the official education assessment bodies from four countries, is a good example of how the QTI standard has been used to share knowledge and experience in e-assessment, technology development and digital transformation.

Assessing the process of thinking using QTI, Saskia Kespaik & Franck Selles

Introduction to QTI 3.0,  Mark Molenaar

The full list of recordings and presentations is available on the EUNIS website.

A student voice in the DEO: What have the Student Digital Champions found so far?

Since the 2019 Digital Experience Insight Survey, which revealed so much about students’ experiences of the digital learning environment at Bristol even before the pandemic, the DEO have been keen to channel student voices straight into our work. With 2020 turning out the way that it did, it was even more crucial to make that a reality, and so we worked with Bristol SU to recruit 12 Student Digital Champions (SDCs) from across all faculties in the University. They don’t have a lot of time each week with us, but they’ve definitely been making the most of that time so far!  

You can get to know them a little better by viewing this introduction video, also found on the DEO Student Digital Champion project page. 

What have they been up to? 

Since joining the DEO team, the SDCs have been actively getting out into their faculties, going to course rep meetings and faculty meetings, and talking to staff Digital Champions and other key staff. They’re reporting that even just being in meetings with their ‘digital champion’ hat on has been sparking interesting conversations with course reps and students about the student experience of digital learning in 2020 so far. 

They’ve already actively worked with us on two new DEO guides, which were instigated from student feedback in the Pulse surveys. These are the guides on Interactivity in large sessions, and Breakout rooms. They’ve also worked to co-create and give feedback on the Assessment Checklist and Troubleshooting guide, and other areas of the new Digitally Ready online space on assessment, which launched on 5th January to support students during this assessment period.

What have they found?

The remit of the SDCs is to look for patterns emerging in the student experience across faculties and schools, and work together on the key themes of student engagement in learning and community building. They’ve been tasked with getting students to talk about solutions to their problems too: we want to hear ideas for what could be done differently, or what is working really well and how that could be expanded.

So far they’ve noticed…

Some of the common themes which seem to be emerging across the student experience include:

The cohort conundrum  

Students are feeling disconnected, lacking a sense of belonging and of a shared experience. Many are reporting that this is partly due to other students not being active and engaged in online sessions, particularly in not turning on their videos. On the other hand, students also said they feel anxious themselves about being in online sessions, particularly breakout sessions, and about turning on their own mics and video. In the Engineering faculty, students actually felt there was an increase in engagement between students when using the general discussion forum to ask questions. Students seem to be asking more questions and sharing information with each other.

‘I don’t wanna be just a guy on the screen. I want us to be more like a cohort.’  [Year 1 Student, Centre for Innovation] 

Clarity and simplicity make good online course spaces 

Echoing student feedback in previous years, students are now more than ever keen on things being concise, clear, and easy to navigate. Videos around 20-30 minutes seem to be the maximum that students feel they can engage with, with most preferring 10-15 minutes. Our SDCs are also reporting that a messy Blackboard course space can be pretty discouraging, especially to first year students! 

Group work online is brilliant/impossible (delete as appropriate)  

We’re hearing loud and clear that the tools of online learning – shared documents, MS Teams, BB Collaborate and BB journals – are potentially great for making group work easier to manage and coordinate. Students are getting to grips with what these systems can offer, and love the flexibility (when the technology allows – internet connection problems are frequently mentioned too!). But they would like more guidance on these tools and how to use them effectively. At the same time, the lack of group identity, and the fact that they may not have actually met their peers in person, is making things difficult.

‘Only few people are turning up. How can I trust someone to do their work when we’ve never met?’ [Year 1 Student, Arts]

And they’ve suggested…

There are already several projects in the pipeline, ideas for what might be possible, and pilots in progress. A snapshot of these includes:

A Breakout Room toolkit – A toolkit for staff, made by students, on how to plan and deliver the best breakout room experience. This is broken down by year, recognising that first years have different needs and situations than returning students. It includes ideas for group sizes and permanence (3-5 week rotations for groups seem popular), and establishing group identity, as well as how to encourage students to actively participate. More on this soon…

‘Online mingle’ pilot – In partnership with the Centre for Innovation, creating a template for how to run ‘speed dating’ type welcome sessions for students, where they can get to know each other and practice speaking online in a safe and fun environment.  

Motivation Panels – Here, more experienced students support first years involved in team/group work, helping to spark a sense of what their degree is about and to build motivation for the subject. Led by course reps and students, this is a way to feel part of something bigger than your own unit or programme.

Shared spaces – using tools like MS Teams to explore ways for students to meet regularly and informally. This could include news and inspiration, notices of events, a ‘Help me out’ forum, and introductions to different people within their programme or school. 

Groupwork toolkits – Deliverables to help students choose the best tools for group work, and how to use them, as well as how to make the most of group work as a way to meet people and gain the sense of social interaction often missing online.

School assemblies – Regular school-wide live sessions, to give a sense of belonging and motivation across a school, rather than just within a unit or programme. These are already being run in the School of Psychological Science, and the SDCs are working to find out what it is about them that is so engaging, and how that might be replicated across the university.


 

Where are we now? – notes from the reading group

For our first reading group since March, and our first ever online, we looked at recent (post-COVID) articles on education. It was a somewhat eclectic selection, but it was very good to be back together!

Moving Into the Long Term by Lilah Burke and A Renewed Focus on the Practice of Teaching by Shigeru Miyagawa and Meghan Perdue (notes by Suzi Wells)

These two short articles reflected on the staff (and student) experience of teaching since March. 

Miyagawa and Perdue interviewed more than 30 faculty members at MIT about their experiences. The themes of their responses seem familiar from our experience in Bristol:

  • Many staff voiced an increased interest in the practice of teaching
  • Teaching has been more challenging and at times more rewarding – the crisis has forced us to come up with creative solutions to problems, which can be exciting
  • COVID has forced us to re-evaluate what is important, being unable to rely on face-to-face where we (think we) already know what works
  • Testing students online is harder and staff are questioning why and how much it is needed

A lot of what was covered in the Burke article is not surprising: students (and academics) feeling more isolated, and struggling with the difference between their expectations and where we now find ourselves. One of the people interviewed raised the point that so much has changed it will be hard to measure whether learning has suffered (or indeed improved). This seemed interesting to me and made me wonder what we can meaningfully measure, and in particular whether we can measure or evaluate what we learn from just dealing with a crisis like this.

How universities can ensure students still have a good experience, despite coronavirus (notes by Chrysanthi Tseloudi)

The article suggests 3 things universities can do to improve students’ experience during coronavirus (and in general).

  1. Listen: Survey students regularly, make changes based on the answers and communicate these to students.
  2. Communicate: via multiple channels (email is not the best for students), explain from a student’s point of view, tailored to different students.
  3. Invest: in hardware, software, networking capacity, staff training to ensure quality, consistency and innovation.

Just in time CPD by Virna Rossi (notes by Michael Marcinkowski)

This piece offered personal reflections on support strategies for helping teaching staff adapt to online teaching in the wake of COVID-19. The author highlighted the use of a staff-wide chat built into the University’s VLE and detailed the trials and tribulations of trying to answer questions posted by staff in video form. Though mostly a personal reflection on the processes, this piece did contain a number of salient details:

  1. The author tried to use video responses to questions in order to evoke a sense of being present with teaching staff. Wellbeing, for both staff and students, was a prime concern, as evidenced by the questions asked and the use of support materials related to wellbeing, though it remains an open question whether or not the use of video in this case had its intended impact. What can be said is that the author found the process of video production to be time consuming.
  2. They also consciously used “low tech” approaches in their demonstrations of online teaching for staff, in the belief that this would make staff feel more comfortable about making less-than-perfect resources. This included creating hand-drawn slides for use in video presentations.

Overall, the article was an interesting read for the personal detail that it provided; however, it had little substantive advice to build on, outside of the general claim regarding the importance of support and a concern for staff wellbeing.

Designing out plagiarism for online assessment (notes by Hannah Gurr)

246 reasons to cheat: outsourcing from essay mills is a way for students to ‘quit’ without losing the qualification they were working towards. Some may turn to this type of cheating due to an inability to handle the academic workload, or an unwillingness to do so.

HE institutions need to know why plagiarism happens, while students need to come to understand the range of ways in which plagiarism can occur. HEIs need a developmental approach in formative assignments to help students learn how to avoid plagiarism. The academic community also needs to place a positive focus on academic integrity (e.g. UoB’s six values of honesty, trust, fairness, respect, responsibility and courage), not just a negative focus on misconduct.

A Different Way to Deliver Student Feedback (lessons from the performing arts for STEM) (notes by Chrysanthi Tseloudi)

Tough-love feedback on open-ended work usually doesn’t work well. Students don’t receive it well and may feel alienated, while instructors often shift the blame to them for not being able to handle critical feedback.

The method described (based in the arts, but in this article aimed at STEM) attempts to shift the dynamics and give the student power over the feedback they receive. It features 3 roles and 4 steps:

Roles: the artist (student), the responder (instructor, student peer, or other feedback giver) and the facilitator (a neutral party, optional).

Steps: 

  1. Statements of Meaning: Responders provide positive feedback about something they found meaningful, interesting, or exciting in the work.
  2. Artist as Questioner: The student asks questions about their work, focusing on the feedback they need at the moment and responders reply to these questions.
  3. Neutral Questions: Responders ask neutral questions (questions without hidden comments/ opinions) about the work, and the student responds.
  4. Opinion Time: Responders can give any other feedback they want – but only if given permission by the student. Students often don’t feel they can say no, so they will need to be reassured that they can.

Writer’s takeaway: Even if not using this method, it’s useful to ask the student what particular feedback they want at that moment. They may be surprised, as many have never been asked before. It will take them a bit of time to get used to it. But once they feel secure, tough love won’t be needed for their work to improve.

Virtual Learning Should and Can Be Hands-On (focus on labs) by Alexis R. Abramson (notes by Paddy Uglow)

Course leaders at Dartmouth College were able to keep the hands-on learning qualities of their engineering courses in the following ways:

  • $200 mini 3D printers were sent to students
  • Some lab equipment was adapted for remote operation
  • Hardware kits were sent to students containing cheap components that could be used to carry out experiments and demonstrate principles.
  • Students and staff used their imagination and home resources to replace lab-based equipment

The Reading Group discussed the article, and talked about the advantages of these methods and the use of VR video (of experiments and medical procedures). These included:

  • A real sense of “getting your hands dirty” (eg leaking chemicals, mistakes in following procedure, spillages, etc) which can’t be replicated with a computer-based version (it would be interesting to compare student performance between those learning virtually and physically – medical students practice injections on oranges, for example)
  • There’s no queuing for equipment or being unable to see properly when a demonstration is given
  • Lab experiments are often done in groups, and sometimes one person rushes ahead and doesn’t let the rest of their group gain a full understanding of what’s happening. Working at home with a kit, each student has to do it themselves, or at least gain the learning experience of why they’ve been unable to do it.

During the discussion, it emerged that the University of Bristol has been using similar techniques.

National Institute for Digital Learning good reads from 2019 – notes from the reading group

Do MOOCs contribute to student equity and social inclusion? A systematic review by Sarah R Lambert (read by Suzi Wells)

This study is a large literature review looking at both empirical research and practitioner guidance around using MOOCs and other open (as in free) learning to promote student equity and social inclusion. The study starts from 2014 because that’s the second wave of MOOCs, where more stable and sustainable practice begins to emerge.

The aims of the MOOCs were broken into two broad areas: those focused on student equity (making education fairer for tertiary learners and those preparing to enrol in HE) – this was the most common aim – and those that sought to improve social inclusion (education of non-award holders and lifelong learners). The literature was predominantly from the US, UK and Australia – perhaps unsurprising, given that only articles written in English were studied. There was a near 50/50 split between empirical research and policy/practice recommendations, and the studies focused slightly more on MOOCs than on other open learning. One notable finding was that the success rate (among published studies at least) was high – more often than not projects met or exceeded their aims.

Lambert includes lots of useful detail about factors that may have led to successful projects. MOOC developers should learn about their learners and make content relevant and relatable to them. Successful projects often had community partners involved in the design, delivery & support – in particular, initiatives with large cohorts (~100) that were very successful all had this. Designing for the learners meant things like: designing for mobile-only and offline access, teaching people in their own language (or at least providing mother-tongue facilitation) and, where relevant, mixing practitioners with academics in the content.

Facilitating and supporting the learning was also key to success. Local study groups or face-to-face workshops were used by some projects to provide localisation and contextualisation. Facilitators would ideally be drawn from existing community networks.

A related point was to design content from scratch – recycling existing HE materials was not as successful. This should be done in an interdisciplinary team and/or community partnership. Being driven entirely by an IT or digital education team was an indicator that a project would not meet its aims: projects need technical expertise, but education and/or widening participation expertise too. Open as in free-to-use is fine; the licence didn’t seem to have an impact.

In short:

  • Work with the people you intend to benefit.
  • Create, don’t recycle.
  • Don’t expect the materials to stand by themselves.

If you’re interested in social justice through open learning, think OU not OERs.

What does the ‘Postdigital’ mean for education? Three critical perspectives on the digital, with implications for educational research and practice by Jeremy Knox (read by Suzanne Collins)

This article explores the idea of what ‘post-digital’ education means, specifically thinking about human-technology relationships. It begins with an analysis of the term ‘post-digital’, embracing the perspective of ‘post’ as a critical appraisal of our understanding of the digital, rather than simply meaning a different stage after the digital. This initial analysis is worth a read, but it is not my main focus for this reading group, so here I’ll jump straight to the main discussion, which is based on three critical perspectives on the digital in education.

The first is “Digital as Capital”. Here, Knox talks about the commercialisation and capitalisation of digital platforms, such as social media. This platform model is increasingly based on the commodification of data, and so inevitably students/teachers/learners become seen as something which can be analysed (eg learning analytics), or something under surveillance. If surveillance is equated to recognition, this leads to further (perhaps troubling?) implications. Do you need to be seen to be counted as a learner? Is learning always visible? Does this move away from the idea of the web and digital being ‘social constructivist’?

Secondly, Knox looks at “Digital as Policy”. This (for me) slightly more familiar ground discusses the idea that ‘digital’ education is no longer as separate or distinct from ‘education’ as it once was. In a ‘post-digital’ understanding, it is in fact mainstream rather than alternative or progressive. The digital in education, however, often manifests as metrification in governance – eg schools are searchable in rankings based on algorithms. In this sense, ‘digital education’ moves away from ‘classroom gadgets’ (as Knox puts it) and sees it as something intrinsic and embedded in policy, with strategic influence.

Lastly, he discusses “Digital as Material”, which focuses on surfacing the hidden material dimensions of a sector which was often seen as ‘virtual’ and therefore ‘intangible’. The tangible, material aspects of digital education include devices, servers, and other physical elements which require manual labour and material resources. On one hand there is efficiency, but on the other there is always labour. As education, particularly digital education, often comes from a sense of social egalitarianism and social justice, this is a troubling realisation, and one which leads to a rethink of the way digital education is positioned in a post-digital perspective.

In conclusion, Knox suggests that ‘post-digital’ should be understood as a ‘holding to account of the many assumptions associated with digital technology’, which I feel sums up his argument and is probably something we should try and do more of regardless of whether we’re a ‘digital’ or ‘post-digital’ education office.

What’s the problem with learning analytics? by Neil Selwyn (read by Michael Marcinkowski)

For this last session I read Neil Selwyn’s ‘What’s the Problem with Learning Analytics’ from the Journal of Learning Analytics. Appearing in a journal published by the Society for Learning Analytics Research, Selwyn’s socio-technical approach to the analysis of learning analytics was a welcome, if somewhat predictable, take on a field that too often seems to find itself somewhat myopically digging for solutions to its own narrow set of questions.

Setting learning analytics within a larger social, cultural, and economic field of analysis, Selwyn lays out an orderly account of a number of critical concerns, organized around the implications and values present in learning analytics.

Selwyn lists these consequences of learning analytics as areas to be questioned:

  1. A reduced understanding of education: instead of a holistic view of education it is reduced to a simple numerical metric.
  2. Ignoring the broader social contexts of education: there is a danger that by limiting the understanding of education that we ignore important contextual factors affecting education.
  3. Reducing students’ and teachers’ capacity for informed decision-making: the results of learning analytics come to overtake other types of decision making.
  4. A means of surveillance rather than support: in their use, learning analytics can have more punitive rather than pedagogical implications.
  5. A source of performativity: students and teachers each begin to focus on achieving results that can be measured by analytics rather than other measures of learning.
  6. Disadvantaging a large number of people: like any data driven system, decisions about winners and losers can be unintentionally baked into the system.
  7. Servicing institutional rather than individual interests: the analytics has more direct benefit for educational institutions and analytic providers than it does for students.

He goes on to list several questionable values embedded in learning analytics:

  1. A blind faith in data: There is a risk that there is a contemporary over-valuation of the importance of data.
  2. Concerns over the data economy: What are the implications when student data is monetized by companies?
  3. The limits of free choice and individual agency: Does a reliance on analytic data remove the ability of students and educators to have a say in their education?
  4. An implicit techno-idealism: Part of learning analytics is a belief in the benefits of the impacts of technology.

Toward this, Selwyn proposes a few broad areas for change designed to improve learning analytics’ standing within a wider field of concern:

  1. Rethink the design of learning analytics: allow for more transparency and customization for students.
  2. Rethink the economics of learning analytics: give students ownership of their data.
  3. Rethink the governance of learning analytics: establish regulatory oversight for student data.
  4. A better public understanding of learning analytics: educate the wider public of the ethical implications of the application of learning analytics to student data.

Overall, Selwyn’s main point remains the most valuable: the idea of learning analytics should be examined within the full constellation of social and cultural structures within which it is embedded. Like any form of data analytics, learning analytics does not exist as a perfect guide to any action, and the insights that are generated by it need to be understood as only partial and implicated by the mechanisms designed to generate the data. In the end, Selwyn’s account is a helpful one — it is useful to have such critical voices welcomed into SoLAR — but the level at which he casts his recommendations remains too broad for anything other than a starting point. Setting clear policy goals and fostering a broad understanding of learning analytics are hopeful external changes that can be made to the context within which learning analytics is used, but in the end, what is necessary is for those working in the field of learning analytics who are constructing systems of data generation and analysis to alter the approaches that they take, both in the ‘ownership’ and interpretation of student data. This reinforces the need to change both how we understand what ‘data’ is and how we think about using it. Following Selwyn, the most important change might be to re-evaluate the ontological constitution of data and our connection to it, coming to understand it not as something distinct from students’ education, but as an integral part of it.

Valuing technology-enhanced academic conferences for continuing professional development: a systematic literature review (Professional Development in Education) by Maria Spilker (read by Naomi Beckett)

This literature review analyses the different approaches taken to technologically enhance academic conferences for continuing professional development. Although there have been advances and new practices emerging, a coherent approach was lacking: conferences were being evaluated in specific ways that did not consider all sides.

‘Continued professional development for academics is critical in times of increased speed and innovation, and this intensifies the responsibilities of academics.’ 

This makes it more important to ensure that when academics come together at a conference, there is a systematic approach to looking at what they should be getting out of the time spent there. The paper suggests this is something that needs to be looked at when first starting to plan a conference: what are the values?

The paper talks about developing different learning experiences at a conference to engage staff and build their professional development. There is often little time for reflection, and the paper suggests looking at more ways to include this; using technology is one way this could be done. Engagement on Twitter, for example, gives users another channel to discuss and network, and takes them away from the traditional conference formats. Making more conferences available online also gives users the opportunity to reach wider networks.

The paper mentions their Value Creation Network, looking at what values we should be taking out of conferences. These include immediate value, potential value, applied value, realised value, and re-framing value. Looking at these to begin with is a good start to thinking about how we can frame academic conferences, so delegates get the most use out of the time spent there, and work on their own professional development too.

We asked teenagers what adults are missing about technology. This was the best response by Taylor Fang (read by Paddy Uglow)

Some thoughts I took away:

  • Traditionally a “screen” was a thing to hide or protect, and to aid privacy. Now it’s very much the opposite.
  • Has society changed so much that social media is the only place that young people can express themselves and build a picture of who they are and what their place is in the world?
  • Adults have a duty to help young people find other ways to show who they are to the world (and the subsequent reflection back to themselves)
  • Digital = data = monetisation: everything young people do online goes into a money-making system which doesn’t have their benefit as its primary goal.
  • Young people are growing up in a world where their importance and value are quantified by stats, likes, shares etc, and all these numbers demonstrate to them that they’re less important than other people, encouraging desperate measures to improve their metrics.
  • Does a meal/holiday/party/etc really exist unless it’s been published and Liked?
  • Does the same apply to Learning Analytics? Are some of the most useful learning experiences those which don’t have a definition or a grade?


Digital Accessibility and Neurodiversity

Dafydd presenting to a lecture theatre

Last week we hosted the third of our Digital Accessibility events, this time with Dafydd Henke-Reed, Senior Accessibility Consultant with AbilityNet. Dafydd has been diagnosed with Autism and Dyslexia and spoke about his personal experiences of Neurodiversity.

Dafydd was engaging and open about his experiences growing up, going to University and the technology he uses day to day. From the very start he highlighted that Autism is a spectrum and that we were hearing what Neurodiversity means to him.

From Cognitive Brick Walls to being horrified when friendly lecturers asked him to move forward from the back row of a lecture theatre, we heard about the barriers and obstacles he had faced.

What stood out for me

 

“Dyslexia could be solved with tools; Autism was about learning how to thrive in a seemingly hostile culture.”

Dafydd had refused support related to Autism at University. Tactics such as large yellow “appropriate allowance when marking” stickers felt like a brand. This is pertinent: many students may not disclose their “disabilities” due to previous experience, or because they find allowances intrusive or counterproductive. In fact, people with conditions such as Autism Spectrum Disorder may not consider it a disability in the first place – it’s just the way they are. If we are to be truly inclusive, then we need to design our learning experiences to remove barriers, so that everyone benefits.

“Come over for group study and we’ll get beers and Pizza in? Hell no!”

Dafydd spoke about how he found groups and teamwork challenging. He’ll use digital tools like Slack or instant messaging to communicate rather than walking to a colleague’s desk. He also praised electronic tickets (“I won’t lose them”).

He showed us the Speech to Text (STT) and Text to Speech (TTS) systems he uses every day along with the spelling correction functionality.

“Not good enough”

Dafydd will obsess about making things perfect. From essays with over 20 drafts to repeatedly painting his bathroom wall until a relative intervened to say it was fine, he needed regular feedback to help get past the compulsion to improve something.

Do’s and Don’ts

 

The excellent UK Gov “Do’s and Don’ts” guides were given a name check again, this time for Dyslexia and Autism. If you haven’t seen them, check out these lovely visual guide posters. I think they should be printed out in every office!

Government Designing for disability guides

We have one more session with AbilityNet, on 5th February, looking at Physical Impairment; there are still a handful of tickets left.

Curriculum design – notes from the reading group

Exploring curriculum design approaches (report by Suzanne Collins)

Suzanne talked about her work exploring curriculum design approaches (separate blog post), where she looks at methodologies such as ABC, Carpe Diem, CAIeRO and ELDeR.

ABC Learning Design (notes by Suzi Wells)

ABC learning design is a rapid design / review methodology developed by Clive Young and Nataša Perović in 2014, and drawing on Laurillard’s ‘Conversational Framework’.

The method is centred around a 90-minute workshop, during which participants:

  • Describe their unit in a tweet
  • Map out the types of activity, currently undertaken or planned, against Laurillard’s six learning types
  • Storyboard the unit using the ABC cards

In the DEO a few of us – Roger Gardner, Suzanne Collins, and I – have trialled this approach. Initially this was with a small number of academics interested in redesigning their credit-bearing units. We made much fuller use of it when supporting the design of the FutureLearn courses, following which Suzanne and I presented on this at a UCL conference in 2018: our presentation on using ABC at Bristol.

One advantage of the methodology is that you could run a single 90 minute workshop looking at an entire programme, allowing potential links between the units to become apparent. The short length of the workshop gives at least some chance to get everyone from the unit together in one place.

The cards are Creative Commons licensed and have been widely adapted, with users adding activities and terminology more relevant to their context. On the ABC site you can download ABC cards for MOOCs designed for FutureLearn and EdX. At the conference we heard how people have used stickers in the storyboarding stage to surface things that they are interested in: employability skills, alignment with the education strategy, and university-wide themes (such as Bristol’s global citizenship, innovation & enterprise, and sustainable futures themes).

Obviously a 90 minute workshop is not going to give you time to finalise many details but ABC is quick to learn, very adaptable, and sparks good conversations. It’s remarkable how much can be done in a short time.

Beyond podcasting: creative approaches to designing educational audio (notes by Chrysanthi Tseloudi)

This paper talks about a pilot that aimed to encourage academics to use podcasts in their teaching through a tool in their VLE. The pilot included initial workshops and various types of support for the 25 participants who decided to try it out. All participants used audio, apart from one team, which used video podcasts. Nine of them shared their experiences for this paper. They had produced various types of resources: videos about clinical techniques (nursing), audio based on research projects which also received audio feedback from the academic (sport), “questions you’re afraid to ask” (art & design), answers to distance learning students’ questions to reduce the sense of isolation in the VLE (communications), etc.

Academics enjoyed using audio for learner-centred pedagogies, but they also encountered some barriers. Expectations of high quality may be a barrier for both staff and students, while assessing student work in this format is time consuming. Not being familiar with the technology can be frustrating for staff and can hold them back, as they would rather not ask students to do something they didn’t feel confident they could do well themselves. Students are not necessarily more confident than staff in using this technology. Following the pilot, the institution’s capacity to support such activities was evaluated and some solutions to support staff were devised.

This was a nice paper with a variety of ideas on using audio for teaching. I found the point about voice on the VLE increasing connectivity and reducing isolation particularly interesting, and would love to see any relevant research on this.

Suggested reading

Approaches to curriculum design


Exploring Curriculum Design Approaches

Recently at the University of Bristol, we’ve all been thinking a lot about learning design, curriculum development and ways of assessing. BILT’s focus on TESTA for transforming assessment is one way you can see this in action. In higher education, learning design can quickly get complicated – for example, redesigning a whole programme – and is increasingly breaking new ground, with online or blended aspects, new assessment methods or innovative pedagogies. Having a method of working when approaching curriculum, programme or learning design can speed up the process and make it much more enjoyable for everyone involved. Helpfully, there are several working methods based on storyboarding which provide a way to navigate this process, and which focus on a team approach to designing learning.

The Digital Education Office have mainly used an approach based on UCL’s ABC: you can read more about our use of this method in a series of blog posts by Suzi Wells and me on a previous ABC conference held at UCL.

Such curriculum design approaches all facilitate discussion and evaluation of current and future learning designs by bringing together relevant stakeholders, learning design specialists and support staff. In the Sway presentation embedded here, we’ll have a quick look at a few, in order to get a taste of what these approaches involve, and how they’ve been used by others. Follow this link to open the Sway in a new tab or window.

Accessibility and mental health

The second in our series of talks from AbilityNet was from Adam, Service Development Manager, on accessibility and mental health. Adam spoke both from his professional and personal experience, and from his in-depth knowledge of technology. The session was fascinating and very useful. Some highlights for me included…

Helping to understand the complexity of the issues

Adam talked about training as a runner as a way of understanding what physical pain you can push through and what you can’t – and the idea that the same holds for mental stresses and strains. Some you can push through and some you can’t – and they will be different for different people. He also talked about the idea that there is an increase in perfectionism in younger generations, be that focused on self-improvement, driven by social pressure, or outward-facing (expecting more of others).

Do’s and don’ts

The accessibility posters produced by GOV.UK are a great set of resources, and people have been adding their own. Adam showed the posters on designing for users with anxiety, which would be a really useful checklist for a number of our services.

Technology tips

There were lots of great recommendations of apps and tools. The ones that stood out for me were:

  • Word can now check for clarity, conciseness and inclusiveness (for example, unnecessarily gendered language)
  • Presenter Coach, which comes free with PowerPoint online and lets you rehearse your presentation to an AI audience, could be useful both to improve your own clarity and to give students a non-threatening way to rehearse
  • Text-to-speech tools are great for proofreading (this is a revelation to me) and also for getting an unemotional reading of emails that have been sent to you – there’s a short sketch of this after the list
  • Forest app rewards you with your own virtual woodland for spending time away from your phone
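
If you want to try the proofreading tip without installing a dedicated app, here is a minimal sketch that reads a draft file aloud through the computer’s built-in voices. It assumes Python and the pyttsx3 library (my choice for illustration – any text-to-speech tool will do), and the filename is a placeholder.

```python
# Tiny sketch of using text-to-speech to proofread a draft (assumes pyttsx3,
# which drives the operating system's built-in voices).
import pyttsx3

def read_aloud(path: str) -> None:
    """Read a text file aloud so awkward phrasing is easier to catch."""
    with open(path, encoding="utf-8") as f:
        text = f.read()

    engine = pyttsx3.init()
    engine.setProperty("rate", 160)  # slow the voice down a little for proofreading
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    read_aloud("draft_email.txt")  # placeholder filename
```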

Finally, as a back-to-paper fan, I love the idea of Google’s printable phone.

What next?

This was another great session with AbilityNet. The two remaining sessions are:

We’ll be releasing some ‘Top Tips’ videos for each strand after the event. We’ll also try to make recordings of the sessions available.

If you would like to talk to the Digital Education Office team about Digital Accessibility or Blackboard Ally, or just have related questions, do feel free to contact us via:

Email: digital-education@bristol.ac.uk
Tel: +44 (0)117 42 83055 / internal: 83055