The Impact of AI on Learning Design – notes from the Reading Group

by Maxine Sims

This week's topic was using AI, starting with the Digital Education Institute's ebook on The Impact of AI on Learning Design. The group also discussed other AI tools they have experimented with, and shared thoughts and opinions on where AI could be useful or risky in education.

Some key takeaways:

  • AI can be really useful for inspiration and ‘tidying up’ your thoughts and learning content plans.
  • Tools are emerging and changing all the time. They can produce content at a faster and cheaper rate than traditional methods, particularly video production tools.
  • There are great opportunities to improve the accessibility of learning by supporting the effort of moving from thought to writing for groups who find this challenging.
  • There is a fear around the uncertainty of its uses, and some scepticism of its value – particularly when it comes to the suggestion that some traditional jobs may become redundant!
  • Like all new technology, it is disruptive and calls into question the value and purpose of education, knowledge and skills. What we gain and what we lose are questions still being debated.
  • Some universities (Sheffield and Oxford, for example) have made progress in creating AI working groups to produce T&L guidance, training and support for staff.
  • AI could free up staff time (a common complaint) to focus on more ‘meaningful’ work, if we can find a comfortable way forward on what data we share and which tasks can be handled by the technology.

Innovating Pedagogy 2024 – notes from the reading group

This week we revived our long-running reading group, partly inspired by London’s Digital Education Reading Group. Our first topic was Innovating Pedagogy 2024 from the Open University. We each read and gave a brief summary of one section, with a little time for questions and discussion. Plenty of food for thought – a really enjoyable and useful conversation. Some recurring themes:

  • Potentials of generative AI
  • Unknowns, risks and costs of AI
  • Inclusion and personalisation
  • Exclusion caused by lack of access to technology or the skills to engage successfully
  • Creating emotionally safe practice spaces – chatbots for rubber duck debugging

We’re running it online to make it easier for people to join. Though this worked well, it definitely makes free-flowing discussion slower and more difficult. Looking forward to future meetings and seeing how the format evolves over time.

Where are we now? – notes from the reading group

For our first reading group since March, and our first ever online, we looked at a selection of recent (post-COVID) articles on education. It was a somewhat eclectic selection, but it was very good to be back together!

Moving Into the Long Term by Lilah Burke and A Renewed Focus on the Practice of Teaching by Shigeru Miyagawa and Meghan Perdue (notes by Suzi Wells)

These two short articles reflected on the staff (and student) experience of teaching since March. 

Miyagawa and Perdue interviewed more than 30 faculty members at MIT about their experiences. The themes of their responses seem familiar from our experience in Bristol:

  • Many staff voiced an increased interest in the practice of teaching
  • Teaching has been more challenging and at times more rewarding – the crisis has forced us to come up with creative solutions to problems, which can be exciting
  • COVID has forced us to re-evaluate what is important, since we can no longer rely on face-to-face teaching, where we (think we) already know what works
  • Testing students online is harder and staff are questioning why and how much it is needed

A lot of what was covered in the Burke article is not surprising: students (and academics) feeling more isolated, and struggling with the difference between their expectations and where we now find ourselves. One of the people interviewed raised the point that so much has changed it will be hard to measure whether learning has suffered (or indeed improved). This seemed interesting to me and made me wonder what we can meaningfully measure, and in particular whether we can measure or evaluate what we learn from just dealing with a crisis like this.

How universities can ensure students still have a good experience, despite coronavirus (notes by Chrysanthi Tseloudi)

The article suggests 3 things universities can do to improve students’ experience during coronavirus (and in general).

  1. Listen: Survey students regularly, make changes based on the answers and communicate these to students.
  2. Communicate: via multiple channels (email is not the best for students), explain from a student’s point of view, tailored to different students.
  3. Invest: in hardware, software, networking capacity, staff training to ensure quality, consistency and innovation.

Just in time CPD by Virna Rossi (notes by Michael Marcinkowski)

This piece offered personal reflections on support strategies for helping teaching staff adapt to online teaching in the wake of COVID-19. The author highlighted the use of a staff-wide chat built into the University’s VLE and detailed the trials and tribulations of trying to answer questions posted by staff in video form. Though mostly a personal reflection on the processes, this piece did contain a number of salient details:

  1. The author tried to use video responses to questions in order to evoke a sense of being present with teaching staff. Wellbeing, for both staff and students, was a prime concern, as evidenced by the questions asked and the uptake of support materials related to wellbeing, though it remains an open question whether the use of video had its intended impact. What can be said is that the author found the process of video production time-consuming.
  2. They also consciously used “low tech” approaches in their demonstrations of online teaching for staff, in the belief that this would make staff feel more comfortable about making less-than-perfect resources. This included creating hand-drawn slides for use in video presentations.

Overall, the article was an interesting read for the personal detail it provided; however, it had little substantive advice to build on beyond the general claim about the importance of support and a concern for staff wellbeing.

Designing out plagiarism for online assessment (notes by Hannah Gurr)

246 reasons to cheat: outsourcing to essay mills is a way for students to ‘quit’ without losing the qualification they were working towards. Some may turn to this type of cheating due to an inability to handle the academic workload, or an unwillingness to do so.

HE institutions need to know why plagiarism happens, while students need to come to understand the range of ways in which plagiarism can occur. HEIs need a developmental approach in formative assignments to help students learn how to avoid plagiarism. The academic community also needs to place a positive focus on academic integrity (e.g. UoB's six values of honesty, trust, fairness, respect, responsibility and courage), not just a negative focus on misconduct.

A Different Way to Deliver Student Feedback (lessons from the performing arts for STEM) (notes by Chrysanthi Tseloudi)

Tough-love feedback on open-ended work usually doesn’t work well. Students don’t receive it well and may feel alienated, while instructors often shift the blame to them for not being able to handle critical feedback.

The method described (rooted in the arts, but aimed here at STEM) attempts to shift the dynamics and give the student power over the feedback they receive. It features three roles and four steps:

Roles: the artist (student), the responder (instructor, student peer, or other feedback giver) and the facilitator (a neutral party, optional).

Steps: 

  1. Statements of Meaning: Responders provide positive feedback about something they found meaningful, interesting, or exciting in the work.
  2. Artist as Questioner: The student asks questions about their work, focusing on the feedback they need at the moment and responders reply to these questions.
  3. Neutral Questions: Responders ask neutral questions (questions without hidden comments/ opinions) about the work, and the student responds.
  4. Opinion Time: Responders can give any other feedback they want – but only if given permission by the student. Students often don’t feel they can say no, so they will need to be reassured that they can.

Writer’s takeaway: Even if not using this method, it’s useful to ask the student what particular feedback they want at that moment. They may be surprised, as many have never been asked before. It will take them a bit of time to get used to it. But once they feel secure, tough love won’t be needed for their work to improve.

Virtual Learning Should and Can Be Hands-On (focus on labs) by Alexis R. Abramson (notes by Paddy Uglow)

Course leaders at Dartmouth College were able to keep the hands-on learning qualities of their engineering courses in the following ways:

  • $200 mini 3D printers were sent to students
  • Some lab equipment was adapted for remote operation
  • Hardware kits were sent to students containing cheap components that could be used to carry out experiments and demonstrate principles.
  • Students and staff used their imagination and home resources to replace lab-based equipment

The Reading Group discussed the article, and talked about the advantages of these methods and the use of VR video (of experiments and medical procedures). These included:

  • A real sense of “getting your hands dirty” (e.g. leaking chemicals, mistakes in following procedure, spillages, etc) which can’t be replicated with a computer-based version (it would be interesting to compare performance between students learning virtually and physically – medical students practise injections on oranges, for example)
  • There’s no queuing for equipment or being unable to see properly when a demonstration is given
  • Lab experiments are often done in groups, and sometimes one person rushes ahead and doesn’t let the rest of their group gain a full understanding of what’s happening. Working at home with a kit, each student has to do it themselves, or at least gain the learning experience of why they’ve been unable to do it.

During the discussion, it emerged that the University of Bristol has been using similar techniques.

National Institute for Digital Learning good reads from 2019 – notes from the reading group

Do MOOCs contribute to student equity and social inclusion? A systematic review by Sarah R Lambert (read by Suzi Wells)

This study is a large literature review looking at both empirical research and practitioner guidance around using MOOCs and other open (as in free) learning to promote student equity and social inclusion. The study starts from 2014 because that is when the second wave of MOOCs began, and more stable and sustainable practice started to emerge.

The aims of the MOOCs were broken into two broad areas: those focused on student equity (making education fairer for tertiary learners and those preparing to enrol in HE) – this was the most common aim – and those that sought to improve social inclusion (education of non-award holders and lifelong learners). The literature was predominantly from the US, UK and Australia – perhaps unsurprising, since only articles written in English were studied. There was a near 50/50 split between empirical research and policy/practice recommendations, and the studies focused slightly more on MOOCs than on other open learning. One notable finding was that the success rate (among published studies at least) was high – more often than not, projects met or exceeded their aims.

Lambert includes lots of useful detail about factors that may have led to successful projects. MOOC developers should learn about their learners and make content relevant and relatable to them. Successful projects often had community partners involved in the design, delivery & support – in particular, initiatives with large cohorts (~100) that were very successful all had this. Designing for the learners meant things like: designing for mobile-only and offline access, teaching people in their own language (or at least providing mother-tongue facilitation) and, where relevant, mixing practitioners with academics in the content.

Facilitating and supporting the learning was also key to success. Local study groups or face-to-face workshops were used by some projects to provide localisation and contextualisation. Facilitators would ideally be drawn from existing community networks.

A related point was to design content from scratch – recycling existing HE materials was not as successful. This should be done in an interdisciplinary team and/or community partnership. Being driven entirely by an IT or digital education team was an indicator that a project would not meet its aims; projects need technical expertise, but education and/or widening participation expertise too. Open as in free-to-use is fine – the licence didn’t seem to have an impact.

In short:

  • Work with the people you intend to benefit.
  • Create, don’t recycle.
  • Don’t expect the materials to stand by themselves.

If you’re interested in social justice through open learning, think OU not OERs.

What does the ‘Postdigital’ mean for education? Three critical perspectives on the digital, with implications for educational research and practice by Jeremy Knox (read by Suzanne Collins)

This article explores what ‘post-digital’ education means, specifically thinking about human–technology relationships. It begins with an analysis of the term ‘post-digital’, embracing the perspective of ‘post’ as a critical appraisal of our understanding of the digital rather than simply meaning a different stage after the digital. This initial analysis is worth a read, but it is not my main focus for this reading group, so here I’ll jump straight to the main discussion, which is based on three critical perspectives on the digital in education.

The first is “Digital as Capital”. Here, Knox talks about the commercialisation and capitalisation of digital platforms, such as social media. This platform model is increasingly based on the commodification of data, and so inevitably students/teachers/learners become seen as something which can be analysed (eg learning analytics), or something under surveillance. If surveillance is equated to recognition, this leads to further (perhaps troubling?) implications. Do you need to be seen to be counted as a learner? Is learning always visible? Does this move away from the idea of the web and digital being ‘social constructivist’?

Secondly, Knox looks at “Digital as Policy”. This (for me) slightly more familiar ground discusses the idea that ‘digital’ education is no longer as separate or distinct from ‘education’ as it once was. In a ‘post-digital’ understanding, it is in fact mainstream rather than alternative or progressive. The digital in education, however, often manifests as metrification in governance – eg schools are searchable in rankings based on algorithms. In this sense, ‘digital education’ moves away from ‘classroom gadgets’ (as Knox puts it) and sees it as something intrinsic and embedded in policy, with strategic influence.

Lastly, he discusses “Digital as Material”, which focuses on surfacing the hidden material dimensions of a sector often seen as ‘virtual’ and therefore ‘intangible’. The tangible, material aspects of digital education include devices, servers, and other physical elements which require manual labour and material resources. On one hand there is efficiency, but on the other there is always labour. As education, and particularly digital education, often comes from a sense of social egalitarianism and social justice, this is a troubling realisation, and one which leads to a rethink in the way digital education is positioned in a post-digital perspective.

In conclusion, Knox suggests that ‘post-digital’ should be understood as a ‘holding to account of the many assumptions associated with digital technology’, which I feel sums up his argument and is probably something we should try and do more of regardless of whether we’re a ‘digital’ or ‘post-digital’ education office.

What’s the problem with learning analytics? by Neil Selwyn (read by Michael Marcinkowski)

For this last session I read Neil Selwyn’s ‘What’s the Problem with Learning Analytics?’ from the Journal of Learning Analytics. Appearing in a journal published by the Society for Learning Analytics Research, Selwyn’s socio-technical approach to the analysis of learning analytics was a welcome, if somewhat predictable, take on a field that too often seems to find itself somewhat myopically digging for solutions to its own narrow set of questions.

Setting learning analytics within a larger social, cultural, and economic field of analysis, Selwyn lays out an orderly account of a number of critical concerns, organized around the implications and values present in learning analytics.

Selwyn lists these consequences of learning analytics as areas to be questioned:

  1. A reduced understanding of education: instead of a holistic view of education it is reduced to a simple numerical metric.
  2. Ignoring the broader social contexts of education: there is a danger that by limiting the understanding of education that we ignore important contextual factors affecting education.
  3. Reducing students’ and teachers’ capacity for informed decision-making: the results of learning analytics come to overtake other types of decision-making.
  4. A means of surveillance rather than support: in their use, learning analytics can have more punitive rather than pedagogical implications.
  5. A source of performativity: students and teachers each begin to focus on achieving results that can be measured by analytics rather than other measures of learning.
  6. Disadvantaging a large number of people: like any data driven system, decisions about winners and losers can be unintentionally baked into the system.
  7. Servicing institutional rather than individual interests: the analytics have more direct benefit for educational institutions and analytics providers than they do for students.

He goes on to list several questionable values embedded in learning analytics:

  1. A blind faith in data: there is a risk of a contemporary over-valuation of the importance of data.
  2. Concerns over the data economy: What are the implications when student data is monetized by companies?
  3. The limits of free choice and individual agency: Does a reliance on analytic data remove the ability of students and educators to have a say in their education?
  4. An implicit techno-idealism: Part of learning analytics is a belief in the benefits of the impacts of technology.

To address these concerns, Selwyn proposes a few broad areas for change, designed to improve learning analytics’ standing within a wider field of concern:

  1. Rethink the design of learning analytics: allow for more transparency and customization for students.
  2. Rethink the economics of learning analytics: give students ownership of their data.
  3. Rethink the governance of learning analytics: establish regulatory oversight for student data.
  4. A better public understanding of learning analytics: educate the wider public about the ethical implications of applying learning analytics to student data.

Overall, Selwyn’s main point remains the most valuable: learning analytics should be examined within the full constellation of social and cultural structures within which it is embedded. Like any form of data analytics, learning analytics does not exist as a perfect guide to action, and the insights it generates need to be understood as only partial and shaped by the mechanisms designed to generate the data. In the end, Selwyn’s account is a helpful one — it is useful to have such critical voices welcomed into SoLAR — but the level at which he casts his recommendations remains too broad for anything other than a starting point. Setting clear policy goals and fostering a broad understanding of learning analytics are hopeful external changes to the context within which learning analytics is used, but ultimately what is necessary is for those working in the field, who are constructing systems of data generation and analysis, to alter the approaches they take, both in the ‘ownership’ and in the interpretation of student data. This reinforces the need to change how we understand what ‘data’ is and how we think about using it. Following Selwyn, the most important change might be to re-evaluate the ontological constitution of data and our connection to it, coming to understand it not as something distinct from students’ education, but as an integral part of it.

Valuing technology-enhanced academic conferences for continuing professional development: a systematic literature review, published in Professional Development in Education, by Maria Spilker (read by Naomi Beckett)

This literature review analyses the different approaches taken to technologically enhance academic conferences for continued professional development. Although there have been advances and new practices emerging, a coherent approach was lacking: conferences were being evaluated in narrow ways that did not consider all perspectives.

‘Continued professional development for academics is critical in times of increased speed and innovation, and this intensifies the responsibilities of academics.’ 

This makes it all the more important to ensure that when academics come together at a conference, there is a systematic approach to looking at what they should be getting out of the time spent there. The paper suggests this needs to be considered when first starting to plan a conference: what are the values?

The paper talks about developing different learning experiences at a conference to engage staff and build their professional development. There is often little time for reflection, and the paper suggests looking at more ways to include it; technology is one way this could be done. Engagement on Twitter, for example, gives users another channel to discuss and network, taking them beyond traditional conference formats. Running more conferences online also gives users the opportunity to reach wider networks.

The paper mentions their Value Creation Network, looking at what values we should be taking out of conferences. These include immediate value, potential value, applied value, realised value and re-framing value. Looking at these from the start is a good way to begin thinking about how we frame academic conferences, so that delegates get the most out of the time spent there and can work on their own professional development too.

We asked teenagers what adults are missing about technology. This was the best response by Taylor Fang (read by Paddy Uglow)

Some thoughts I took away:

  • Traditionally a “screen” was a thing to hide or protect, and to aid privacy. Now it’s very much the opposite.
  • Has society changed so much that social media is the only place that young people can express themselves and build a picture of who they are and what their place is in the world?
  • Adults have a duty to help young people find other ways to show who they are to the world (and to see the subsequent reflection back to themselves)
  • Digital = data = monetisation: everything young people do online goes into a money-making system which doesn’t have their benefit as its primary goal.
  • Young people are growing up in a world where their importance and value are quantified by stats, likes, shares etc; all these numbers demonstrate to them that they’re less important than other people, and encourage desperate measures to improve their metrics.
  • Does a meal/holiday/party/etc really exist unless it’s been published and Liked?
  • Does the same apply to Learning Analytics? Are some of the most useful learning experiences those which don’t have a definition or a grade?


Curriculum design – notes from the reading group

Exploring curriculum design approaches (report by Suzanne Collins)

Suzanne talked about her work exploring curriculum design approaches (see her separate blog post), where she looks at methodologies such as ABC, Carpe Diem, CAIeRO and ELDeR.

ABC Learning Design (notes by Suzi Wells)

ABC learning design is a rapid design/review methodology developed by Clive Young and Nataša Perović in 2014, drawing on Laurillard’s ‘Conversational Framework’.

The method is centred around a 90-minute workshop, during which participants:

  • Describe their unit in a tweet
  • Map out the types of activity, currently undertaken or planned, against Laurillard’s six learning types
  • Storyboard the unit using the ABC cards

In the DEO a few of us – Roger Gardner, Suzanne Collins, and I – have trialled this approach. Initially this was with a small number of academics interested in redesigning their credit-bearing units. We made much fuller use of it when supporting the design of the FutureLearn courses, following which Suzanne and I presented on this at a UCL conference in 2018: our presentation on using ABC at Bristol.

One advantage of the methodology is that you could run a single 90-minute workshop looking at an entire programme, allowing potential links between the units to become apparent. The short length of the workshop gives at least some chance of getting everyone from the unit together in one place.

The cards are Creative Commons licensed and have been widely adapted, with users adding activities and terminology more relevant to their context. On the ABC site you can download ABC cards for MOOCs designed for FutureLearn and EdX. At the conference we heard how people have used stickers in the storyboarding stage to surface things they are interested in: employability skills, alignment with the education strategy, and university-wide themes (such as Bristol’s global citizenship, innovation & enterprise, and sustainable futures themes).

Obviously a 90-minute workshop is not going to give you time to finalise many details, but ABC is quick to learn, very adaptable, and sparks good conversations. It’s remarkable how much can be done in a short time.

Beyond podcasting: creative approaches to designing educational audio (notes by Chrysanthi Tseloudi)

This paper talks about a pilot that aimed to encourage academics to use podcasts in their teaching through a tool in their VLE. The pilot included initial workshops and various types of support for the 25 participants who decided to try it out. All participants used audio, apart from one team, which used video podcasts. Nine of them shared their experience for this paper. They had produced various types of resources: videos about clinical techniques (nursing), audio based on research projects which also received audio feedback from the academic (sport), “questions you’re afraid to ask” (art & design), answers to distance learning students’ questions to reduce the sense of isolation in the VLE (communications), etc.

Academics enjoyed using audio for learner-centred pedagogies, but they also encountered some barriers. Expectations for high quality may be a barrier for both staff and students, while assessing student work in this format is time consuming. Not being familiar with the technology can be frustrating and impeding for staff, as they would rather not ask students to do something they didn’t feel confident they could do well themselves. Students are not necessarily more confident than them in using this technology. Following the pilot, the institution’s capacity to support such activities was evaluated and some solutions to support staff were devised.

This was a nice paper with a variety of ideas on using audio for teaching. I found the point about voice on the VLE increasing connectivity and reducing isolation particularly interesting, and would love to see any relevant research on this.

Suggested reading

Approaches to curriculum design


Accessibility and mental health

The second in our series of talks from AbilityNet was from Adam, Service Development Manager, on accessibility and mental health. Adam spoke both from his professional and personal experience, and from his in-depth knowledge of technology. The session was fascinating and very useful. Some highlights for me included…

Helping to understand the complexity of the issues

Adam talked about training, as a runner, as a way of understanding what physical pain you can push through and what you can’t – and the idea that the same holds for mental stresses and strains. Some you can push through and some you can’t, and they will be different for different people. He also talked about the idea that there is an increase in perfectionism in younger generations, whether self-directed (self-improvement), down to social pressure, or outward-facing (expecting more of others).

Do’s and don’ts

The accessibility posters produced by GOV.UK are a great set of resources, and people have been adding their own. Adam showed the posters on designing for users with anxiety, which would be a really useful checklist for a number of our services.

Technology tips

There were lots of great recommendations of apps and tools. The ones that stood out for me were:

  • Word can now check for clarity, conciseness and inclusiveness (for example, unnecessarily gendered language)
  • Presenter Coach, which comes free with PowerPoint online and lets you rehearse your presentation to an AI audience, could be useful both for improving your own clarity and for giving students a non-threatening way to rehearse
  • Text-to-speech tools are great for proofreading (this was a revelation to me) and also for getting an unemotional reading of emails that have been sent to you
  • The Forest app rewards you with your own virtual woodland for spending time away from your phone

Finally, as a back-to-paper fan, I love the idea of Google’s printable phone.

What next?

This was another great session with AbilityNet. The two remaining sessions are:

We’ll be releasing some ‘Top Tips’ videos for each strand after the event. We’ll also try to make recordings of the sessions available.

If you would like to talk to the Digital Education Office team about Digital Accessibility or Blackboard Ally, or if you just have related questions, do feel free to contact us via:

Email: digital-education@bristol.ac.uk
Tel: +44 (0)117 42 83055 / internal: 83055

Notes from the reading group – free choice

Digital wellbeing toolkit from BBC R&D (notes from Suzi Wells)

This toolkit from BBC R&D is designed for addressing wellbeing in digital product development. Given the increasing focus on student wellbeing, I was interested in whether it could be useful for online and blended learning.

The key resources provided are a set of values cards along with some exploration of what these mean, particularly for young people (16-34 year olds). These were developed by the BBC through desk research, focus groups and surveys.

The toolkit – the flashcards especially – could certainly be used to sense-check whether very digital teaching methods are really including and supporting the kinds of things we might take for granted in face-to-face education. It could also be useful for working with students to discuss the kinds of things that are important to them in our context, and identifying the kinds of things that really aren’t.

Some of the values seem too obvious (“being safe and well”, “receiving recognition”) or baked into education (“achieving goals”, “growing myself”), and I worry that could be off-putting to some audiences. The language also seemed a little strange: “human values”, as though “humans” were an alien species. It can feel like the target audience for the more descriptive parts is someone who has never met a “human”, much less a young one. Nonetheless, the flashcards in particular could be a useful way to kick off discussions.

Three example flash cards showing the values: being inspired, expressing myself, having autonomy

New studies show the cost of student laptop use in lecture classes (notes from Michael Marcinkowski)

The article I read for this session highlighted two new-ish articles which studied the impact that student laptop use had on learning during lectures. In the roughly ten years that laptops have been a common sight in lecture halls, a number of studies have looked at what impact they have on notetaking during class. These previous studies have frequently found a negative association with laptop use for notetaking in lectures, not only for the student using the laptop, but also for other students sitting nearby, distracted by the laptop screen.

The article took a look at two new studies that attempted to tackle some of the limitations of previous work, particularly addressing the correlative nature of previous findings: perhaps low performing students prefer to use laptops for notetaking so that they can do something else during lectures.

What bears mentioning is that there is something somewhat quaint about studying student laptop use. In most cases, it seems to be a foregone conclusion and there is no getting it back into the box. Students will use laptops and other digital technologies in class — there’s almost no other option at this point. Nevertheless, the studies proved interesting.

The first of the highlighted studies featured an experimental set up, randomly assigning students in different sections of an economics class to different conditions: notetaking with laptop, without laptop, or with tablet laying flat on the desk. The last condition was designed to test the effect of students’ being distracted by seeing other students’ screens; the supposition being that if the tablet was laid flat on a desk, it wouldn’t be visible to other students. The students’ performance was then measured based on a final exam taken by students across the three conditions.

After controlling for demographics, GPA, and ACT entrance exam scores, the researchers found that performance was lower for students using digital technologies for notetaking. However, while performance was lower on the multiple choice and short answer sections of the exam, performance on the essay portion of the exam was the same across all three conditions.

While the study did address some shortcomings of previous studies (particularly with its randomized experimental design), it also introduced several others. Importantly, it raised questions about how teachers might teach differently when faced with a class of laptop users, or what effect forcing a student who isn’t comfortable using a laptop might have on their performance. Also, given that multiple sections of an economics class were the subject of the study, what role does the discipline being lectured on play in the impact of laptop use?

The second study attempted to address these through a novel design, which linked students’ propensity to use or not use laptops in optional-use classes to whether they were required to use them, or prohibited from using them, in another class on the same day. Researchers looked at institution-wide student performance at an institution that had a mix of classes which required, forbade, or had no rules about laptop use.

By looking at student performance in classes in which laptop use was optional, and linking that performance to whether students’ laptop choices would be influenced by other classes held the same day, researchers aimed to measure student performance when students had a chance not to use a laptop in class. That is, the design allowed researchers to understand in general how many students might be using a laptop in a laptop-optional class, while still allowing individual students to make a choice based on preference.

What they found was that student performance worsened in classes that shared a day with laptop-mandated classes, and improved in classes that shared a day with laptop-prohibited classes. This is in line with previous studies but, interestingly, the negative effects were stronger for weaker students and in quantitative classes.

In the end, even while these two new studies reinforce what had been previously demonstrated about student laptop use, is there anything that can be done to counter what seem to be the negative effects of laptop use for notetaking? More than anything, what seems to be needed are studies looking at how to boost student performance when using technology in the classroom.

StudyGotchi: Tamagotchi-Like Game-Mechanics to Motivate Students During a Programming Course & To Gamify or Not to Gamify: Towards Developing Design Guidelines for Mobile Language Learning Applications to Support User Experience; two poster/demo papers in the EC-TEL 2019 Proceedings (notes from Chrysanthi Tseloudi)

Both papers talk about the authors’ findings on gamified applications related to learning.

The first regards the app StudyGotchi, based on the Tamagotchi (a virtual pet the user takes care of), which aims to encourage first-year Java programming students to complete tasks on their Moodle platform in order to keep a virtual teacher happy. Less than half of the students downloaded the app, with half of those receiving the control version without the game functions (180) and half receiving the game version (194). According to the survey, some of those who didn’t download it reported that this was because they weren’t interested in it; of those who replied to whether they used it, less than half said they did. According to data collected from students who used either version of the app, there was no difference in online behaviour or exam grades between the groups using the game and non-game versions. The authors attribute this to the lack of interaction, personalisation options and immediate feedback on the students’ actions in Moodle. I also wonder whether a virtual teacher to be made happy is the best choice of “pet”, when hopefully there is already a real teacher supporting students’ learning. Maybe a virtual brain with regions that light up when a quiz is completed, or any non-realistic representation connected to the students’ own development, would be more helpful in increasing students’ intrinsic motivation, since ideally they would be learning for themselves, and not to make someone else happy.

The second paper compares two language learning apps, one of which is gamified. The non-gamified app (LearnIT ASAP) includes exercises where students fill in missing words, feedback showing whether the answer is correct or incorrect, and statistics to track progress. The gamified app (Starfighter) includes exercises where students steer through an asteroid field by selecting answers to given exercises, and a leaderboard to track progress and compete with peers. The evaluation involved interviewing 11 individuals aged 20–50. The authors found that younger and older participants had different views about the types of interaction and the aesthetics of the two apps. Younger participants would have preferred swiping to tapping; older participants seemed to find the non-gamified app comfortable because it looked like a webpage, but were not so sure about the gamified app. The game mechanics of Starfighter were thought to be more engaging, while the pedagogical approach of LearnIT ASAP was thought to be better in terms of instructional value and effectiveness. While the authors mention that the main difference between the apps is gamification, considering the finding that the pedagogical approach of one of the apps is better, I wonder if that is actually the case. Which game elements actually improve engagement, and how, is still being researched, so I would really like to see comparisons of language learning apps where the presence or absence of game elements is indeed the only difference. Using different pedagogical approaches between the apps is less likely to shed light on the best game elements to use, but it does emphasise the difficulty of creating an application that is both educationally valuable and fun at the same time.

AI and education – notes from reading group

Will Higher Ed Keep AI in Check? (notes from Chrysanthi Tseloudi)

In this article, Frederick Singer argues that whether the future of AI in education is a positive or a dystopian one depends not on a decision to use or not use AI, but on retaining control over how it is used.

The author starts by mentioning a few examples of how AI is or may be used outside of education – both risky and useful. They then move on to AI’s use in educational contexts, with examples including an AI chatbot for students’ queries about enrolment and financial issues, as well as AI-powered video transcription that can help accessibility. The area they identify as having the most potential for both risk and benefit is AI helping educators address individual students’ needs and indicating when intervention is needed; there are concerns about data privacy, and about achieving the opposite results if educators lose control.

The final example they mention is using AI in the admissions process to sidestep human biases and help identify promising applicants, without automatically rejecting students that the AI tool does not identify as promising.

I think this is something to be cautious about. Using AI for assessment – whether for admission, to mark activities, to track progress, etc – certainly has potential, but AI is not free of human biases. In fact, there have been several examples where AI systems are full of them. The article Rise of the racist robots – how AI is learning all our worst impulses and Cathy O’Neil’s TED talk The era of blind faith in big data must end report that AI algorithms can be racist and sexist, because they rely on datasets that already contain biases. For example, a dataset of successful people is essentially a dataset of past human opinions about who can be successful, and human opinions are biased: if only a specific group of people have been culturally allowed to be successful, a person who doesn’t belong to that group will not be seen by the AI as equally (or more) promising as those who do. AI algorithms can also be obscure – it is not necessarily obvious what they are picking up on to make their judgements – so it’s important to be vigilant, and for the scientists who build them to implement ways to counteract potential discrimination arising from them.

It’s not hard to see how this could apply in educational contexts. For example, algorithms using datasets from disciplines that are currently more male-dominated might rank women as less likely to succeed, and algorithms trained on data consisting overwhelmingly of students of one nationality, with very few international students, might mark international students’ work lower. There are probably ways to prevent this, but awareness of the potential for bias is needed for that to happen. All things considered, educators seeing AI as a tool free of bias would be rather worrying; understanding the potential issues is key to retaining control.
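To make the mechanism concrete, here is a minimal, hypothetical sketch (not from any of the articles discussed): a toy classifier is trained on synthetic ‘historical success’ labels that under-select one group, and it learns to penalise that group even at identical ability. The data, the feature names and the use of scikit-learn are illustrative assumptions, not a description of any real admissions system.

```python
# Toy illustration (hypothetical, synthetic data): a classifier trained on
# biased historical decisions reproduces that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Two illustrative features: an 'ability' score (what we would like to
# select on) and group membership (0 or 1), which should be irrelevant.
ability = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# Synthetic historical 'success' labels: past gatekeepers under-selected
# group 1 even at equal ability, so the labels encode that bias.
past_success = (ability - 1.0 * group + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([ability, group])
model = LogisticRegression().fit(X, past_success)

# The model learns a strongly negative weight for group membership:
# it has absorbed the historical bias, not just ability.
print("learned coefficients [ability, group]:", model.coef_[0])

# Two applicants with identical ability but different groups get
# different predicted 'success' probabilities.
applicants = np.array([[1.0, 0], [1.0, 1]])
print("predicted success probability:", model.predict_proba(applicants)[:, 1])
```

The point is not the particular model: any system trained on biased past decisions will tend to reproduce them unless the bias is explicitly measured and counteracted.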

How Artificial Intelligence Can Change Higher Education (notes from Michael Marcinkowski)

For this meeting, I read ‘How Artificial Intelligence Can Change Higher Education,’ a profile of Sebastian Thrun. The article detailed Thrun’s involvement with the popularisation of massive open online courses and the founding of his company, Udacity. Developed out of Thrun’s background working at Google in the field of artificial intelligence, Udacity approaches the question of education as a matter of scale: how can digital systems be used to teach vast numbers of people all over the world? For Thrun, the challenge for education is how to develop student mastery of a subject through online interactions, while at the same time widening the pathways for participation in higher education.

The article, unfortunately, focused mostly on the parallels between Thrun’s work in education and his involvement with the development of autonomous vehicles, highlighting the potential that artificial intelligence technologies have for both, while avoiding any discussion of the particulars of how this transformational vision might be achieved.

Nevertheless, the article still opened up some interesting concerns around questions of scale, and how best to approach the question of how education might function at a scale larger than traditionally conceived. At the heart of this is the role that autonomous systems might have in helping to manage such a large-scale educational system: at what point, and for what tasks, is it appropriate to take human educators out of the loop, or to place them at a further remove from the student? In particular, areas such as the monitoring of student wellbeing and one-on-one tutoring came out as ripe for both innovation and controversy.

While it was disappointing that the article largely avoided the actual issues of using artificial intelligence in education, it did offer an unplanned-for lesson about AI in education. As with the hype surrounding self-driving cars, the promises of a new educational paradigm put forward in this 2012 article still seem far off. While the mythos of the Silicon Valley innovator might cast Thrun as a rebel singularly able to see the true path forward for education, most of his propositions for education, when they were not pie-in-the-sky fantasies, repeated well-worn opinions present throughout the history of education.


Tiddlywinks of teaching – materials from Playful Learning 19

Chrysanthi and I ran a session at the Playful Learning conference, play testing a game we have developed to help consider issues around accessibility and inclusivity. The title of our session was The Tiddlywinks of Teaching.

A first draft of the materials, all Creative Commons licenced, is now available for anyone who is interested: Tiddlywinks of Teaching materials (zip, 3MB).

We will post more about the game when time allows!

Playful Learning 19: mega games, promoting play, and wellbeing

It’s a week since I returned from my three days in leafy Leicester at the Playful Learning conference. It’s an event I have watched from a distance with envy in previous years, so I was very excited to be able to attend, and to play-test a game Chrysanthi Tseloudi and I developed around accessibility and inclusivity.

Some highlights and useful takeaways:

  • Mega games – Darren Green and Liz Cable ran a Climate Crisis mega game: a simulation of negotiations between countries around reducing carbon emissions. This session was for about 20 people but would have scaled well for much larger numbers. It was fascinating and absorbing. You would need some caution about what lessons students would take away – if you asked me what I learnt I’d have to say: China are key to solving the crisis but impossible to work with (which is obviously down to the way the players interpreted their roles) and I’m too gullible (which sadly is not). Even so, I can see real possibilities for this.
  • Promoting play in HE – I love the sound of the University of Winchester’s festival of play and creativity. At Bristol we have our Learning Games Lunches a few times a year but a festival allows so much more scope to innovate, play test, and to take ideas directly to and from the students.
  • Play for all – There were differing views around whether play had to be voluntary or not, which is obviously an important issue if you are trying to incorporate play within HE, and particularly within the taught curriculum. Reflecting on the kinds of sessions at the conference that worked well for me, and those that didn’t quite, I’m increasingly persuaded that you can only invite people to play and you can’t require them. Maybe providing choice within a set of playful options, so that people retain a sense of ownership or control, would be enough.

I was expecting – hoping I suppose – the conference would introduce me to new game mechanics for use in teaching, and maybe some facilitation ideas. In the end, the more significant focus for me was around wellbeing. It can be too easy to feel invisible and without agency, not part of anything. At Playful Learning everything was very active and collaborative. For three solid days I felt both seen and heard (a phrase which sounds rather corny to my ears but I can’t think of a more accurate one to describe the feeling). Being so connected was hard work at times but a very positive experience.

The idea of play as an indicator of wellbeing was introduced by Alison James in her keynote. She mentioned that animals who are sick or scared can’t play. I now wonder how much play can promote or amplify wellbeing. Can behaving in a playful way sometimes trick you into being more well? I’m reminded of the work of Clowns Without Borders, taking laughter to children who you might imagine couldn’t benefit.

By the time Friday morning came and it was our turn to present, my feeling was that we were addressing a room of supportive friends. Not people who would never criticise (we got some very useful criticism), but friends all the same. This building of community and connection – both for students and staff – is a key thing that playfulness and games could bring to universities.

(Yellow-team-lego photo shamelessly stolen from @malcolmmurray – but myself and two mysterious strangers (or people whose names I have forgotten) built the thing so I’m hoping that’s ok.)

 

 
