Where are we now? – notes from the reading group

For our first reading group since March, and our first ever online, we looked at a selection of recent (post-COVID) articles on education. It was a somewhat eclectic selection but it was very good to be back together!

Moving Into the Long Term by Lilah Burke and A Renewed Focus on the Practice of Teaching by Shigeru Miyagawa and Meghan Perdue (notes by Suzi Wells)

These two short articles reflected on the staff (and student) experience of teaching since March. 

Miyagawa and Perdue interviewed more than 30 faculty members at MIT about their experiences. The themes of their responses seem familiar from our experience in Bristol:

  • Many staff voiced an increased interest in the practice of teaching
  • Teaching has been more challenging and at times more rewarding – the crisis has forced us to come up with creative solutions to problems, which can be exciting
  • COVID has forced us to re-evaluate what is important, since we can no longer rely on face-to-face teaching where we (think we) already know what works
  • Testing students online is harder and staff are questioning why and how much it is needed

A lot of what was covered in the Burke article is not surprising: students (and academics) feeling more isolated, and struggling with the difference between their expectations and where we now find ourselves. One of the people interviewed raised the point that so much has changed it will be hard to measure whether learning has suffered (or indeed improved). This seemed interesting to me and made me wonder what we can meaningfully measure, and in particular whether we can measure or evaluate what we learn from just dealing with a crisis like this.

How universities can ensure students still have a good experience, despite coronavirus (notes by Chrysanthi Tseloudi)

The article suggests 3 things universities can do to improve students’ experience during coronavirus (and in general).

  1. Listen: Survey students regularly, make changes based on the answers and communicate these to students.
  2. Communicate: via multiple channels (email is not the best for students), explain from a student’s point of view, tailored to different students.
  3. Invest: in hardware, software, networking capacity, staff training to ensure quality, consistency and innovation.

Just in time CPD by Virna Rossi (notes by Michael Marcinkowski)

This piece offered personal reflections on support strategies for helping teaching staff adapt to online teaching in the wake of COVID-19. The author highlighted the use of a staff-wide chat built into the University’s VLE and detailed the trials and tribulations of trying to answer questions posted by staff in video form. Though mostly a personal reflection on the processes, this piece did contain a number of salient details:

  1. The author tried to use video responses to questions in order to evoke a sense of being present there with teaching staff. Wellbeing, both for staff and students, was a prime concern, as evidenced by the questions asked and the take-up of support materials related to wellbeing, though it remains an open question whether the use of video in this case had its intended impact. What can be said is that the author found the process of video production to be time consuming.
  2. They also consciously used “low tech” elements in their demonstrations of online teaching for staff, in the belief that these would make staff feel more comfortable about making less-than-perfect resources. This included creating hand-drawn slides for use in video presentations.

Overall, the article was an interesting read for the personal detail that it provided; however, it had little substantive advice to build on, outside of the general claim regarding the importance of support and a concern for staff wellbeing.

Designing out plagiarism for online assessment (notes by Hannah Gurr)

246 reasons to cheat: outsourcing to essay mills is a way for students to ‘quit’ without losing the qualification they were working towards. Students may turn to this type of cheating due to an inability to handle their academic workload, or an unwillingness to do so.

HE Institutions need to know why plagiarism happens, while students need to come to understand the range of ways in which plagiarism can occur. HEIs need a developmental approach in formative assignments to help students know how to avoid plagiarism. The academic community also needs to place a positive focus on academic integrity (e.g. UoB 6 values of honesty, trust, fairness, respect, responsibility, courage), not just a negative focus on misconduct.

A Different Way to Deliver Student Feedback (lessons from the performing arts for STEM) (notes by Chrysanthi Tseloudi)

Tough-love feedback on open-ended work usually doesn’t work well. Students don’t receive it well and may feel alienated, while instructors often shift the blame to them for not being able to handle critical feedback.

The method described (based in the arts, but in this article aimed at STEM) attempts to shift the dynamics and give the student power over the feedback they receive. It features 3 roles and 4 steps:

Roles: the artist (student), the responder (instructor, student peer, or other feedback giver) and the facilitator (a neutral party, optional).

Steps: 

  1. Statements of Meaning: Responders provide positive feedback about something they found meaningful, interesting, or exciting in the work.
  2. Artist as Questioner: The student asks questions about their work, focusing on the feedback they need at the moment and responders reply to these questions.
  3. Neutral Questions: Responders ask neutral questions (questions without hidden comments/ opinions) about the work, and the student responds.
  4. Opinion Time: Responders can give any other feedback they want – but only if given permission by the student. Students often don’t feel they can say no, so they will need to be reassured that they can.

Writer’s takeaway: Even if not using this method, it’s useful to ask the student what particular feedback they want at that moment. They may be surprised, as many have never been asked before. It will take them a bit of time to get used to it. But once they feel secure, tough love won’t be needed for their work to improve.

Virtual Learning Should and Can Be Hands-On (focus on labs) by Alexis R. Abramson (notes by Paddy Uglow)

Course leaders at Dartmouth College were able to keep the hands-on learning qualities of their engineering courses in the following ways:

  • $200 mini 3D printers were sent to students
  • Some lab equipment was adapted for remote operation
  • Hardware kits were sent to students containing cheap components that could be used to carry out experiments and demonstrate principles.
  • Students and staff used their imagination and home resources to replace lab-based equipment

The Reading Group discussed the article, and talked about the advantages of these methods and the use of VR video (of experiments and medical procedures). These included:

  • A real sense of “getting your hands dirty” (eg leaking chemicals, mistakes in following procedure, spillages, etc) which can’t be replicated with a computer-based version (it would be interesting to compare performance between students learning virtually and physically – medical students practise injections on oranges, for example)
  • There’s no queuing for equipment or being unable to see properly when a demonstration is given
  • Lab experiments are often done in groups, and sometimes one person rushes ahead and doesn’t let the rest of their group gain a full understanding of what’s happening. Working at home with a kit, each student has to do it themselves, or at least gain the learning experience of why they’ve been unable to do it.

During the discussion, we found that the University of Bristol has been using similar techniques.

National Institute for Digital Learning good reads from 2019 – notes from the reading group

Do MOOCs contribute to student equity and social inclusion? A systematic review by Sarah R Lambert (read by Suzi Wells)

This study is a large literature review looking at both empirical research and practitioner guidance around using MOOCs and other open (as in free) learning to promote student equity and social inclusion. The study starts from 2014 because that is when the second wave of MOOCs began, and more stable and sustainable practice started to emerge.

The aims of the MOOCs were broken into two broad areas: those focused on student equity (making education fairer for tertiary learners and those preparing to enrol in HE) – this was the most common aim – and those that sought to improve social inclusion (education of non-award holders & lifelong learners). The literature was predominantly from the US, UK and Australia – perhaps unsurprising, as they were only studying literature written in English. There was a near 50/50 split between empirical research and policy/practice recommendations, and the studies focused slightly more on MOOCs than on other open learning. One notable finding was that the success rate (among published studies at least) was high – more often than not, projects met or exceeded their aims.

Lambert includes lots of useful detail about factors that may have led to successful projects. MOOC developers should learn about their learners and make content relevant and relatable to them. Successful projects often had community partners involved in the design, delivery & support – in particular, initiatives with large cohorts (~100) that were very successful all had this. Designing for the learners meant things like: designing for mobile-only and offline access, teaching people in their own language (or at least providing mother-tongue facilitation) and, where relevant, mixing practitioners with academics in the content.

Facilitating and supporting the learning was also key to success. Local study groups or face-to-face workshops were used by some projects to provide localisation and contextualisation. Facilitators would ideally be drawn from existing community networks.

A related point was to design content from scratch – recycling existing HE materials was not as successful. This should be done in an interdisciplinary team and/or community partnership. Being driven entirely by an IT or digital education team was an indicator that a project would not meet its aims: projects need technical expertise, but education and/or widening participation expertise too. Open as in free-to-use is fine; the licence didn’t seem to have an impact.

In short:

  • Work with the people you intend to benefit.
  • Create, don’t recycle.
  • Don’t expect the materials to stand by themselves.

If you’re interested in social justice through open learning, think OU not OERs.

What does the ‘Postdigital’ mean for education? Three critical perspectives on the digital, with implications for educational research and practice by Jeremy Knox (read by Suzanne Collins)

This article explores the idea of what ‘post-digital’ education means, specifically thinking about human-technology relationships. It begins with an analysis of the term ‘post-digital’, embracing the perspective of ‘post’ as a critical appraisal of our understanding of the digital, rather than simply meaning a different stage after the digital. This initial analysis is worth a read, but not my main focus for this reading group, so here I’ll jump straight to the main discussion, which is based on three critical perspectives on digital in education.

The first is “Digital as Capital”. Here, Knox talks about the commercialisation and capitalisation of digital platforms, such as social media. This platform model is increasingly based on the commodification of data, and so inevitably students/teachers/learners become seen as something which can be analysed (eg learning analytics), or something under surveillance. If surveillance is equated to recognition, this leads to further (perhaps troubling?) implications. Do you need to be seen to be counted as a learner? Is learning always visible? Does this move away from the idea of the web and digital being ‘social constructivist’?

Secondly, Knox looks at “Digital as Policy”. This (for me) slightly more familiar ground discusses the idea that ‘digital’ education is no longer as separate or distinct from ‘education’ as it once was. In a ‘post-digital’ understanding, it is in fact mainstream rather than alternative or progressive. The digital in education, however, often manifests as metrification in governance – eg schools are searchable in rankings based on algorithms. In this sense, ‘digital education’ moves away from ‘classroom gadgets’ (as Knox puts it) and becomes something intrinsic and embedded in policy, with strategic influence.

Lastly, he discusses “Digital as Material”, which focuses on surfacing the hidden material dimensions of a sector which was often seen as ‘virtual’ and therefore ‘intangible’. The tangible, material aspects of digital education include devices, servers, and other physical elements which require manual labour and material resources. On one hand there is efficiency, but on the other there is always labour. As education, particularly digital education, often comes from a sense of social egalitarianism and social justice, this is a troubling realisation, and one which leads to a rethink of the way digital education is positioned in a post-digital perspective.

In conclusion, Knox suggests that ‘post-digital’ should be understood as a ‘holding to account of the many assumptions associated with digital technology’, which I feel sums up his argument and is probably something we should try and do more of regardless of whether we’re a ‘digital’ or ‘post-digital’ education office.

What’s the problem with learning analytics? by Neil Selwyn (read by Michael Marcinkowski)

For this last session I read Neil Selwyn’s ‘What’s the Problem with Learning Analytics’ from the Journal of Learning Analytics. Appearing in a journal published by the Society for Learning Analytics Research, Selwyn’s socio-technical approach toward the analysis of learning analytics was a welcome, if somewhat predictable, take on a field that too often seems to find itself somewhat myopically digging for solutions to its own narrow set of questions.

Setting learning analytics within a larger social, cultural, and economic field of analysis, Selwyn lays out an orderly account of a number of critical concerns, organized around the implications and values present in learning analytics.

Selwyn lists these consequences of learning analytics as areas to be questioned:

  1. A reduced understanding of education: instead of a holistic view, education is reduced to a simple numerical metric.
  2. Ignoring the broader social contexts of education: there is a danger that, by limiting our understanding of education, we ignore important contextual factors affecting it.
  3. Reducing students’ and teachers’ capacity for informed decision-making: the results of learning analytics come to overtake other types of decision making.
  4. A means of surveillance rather than support: in their use, learning analytics can have more punitive rather than pedagogical implications.
  5. A source of performativity: students and teachers each begin to focus on achieving results that can be measured by analytics rather than other measures of learning.
  6. Disadvantaging a large number of people: like any data-driven system, decisions about winners and losers can be unintentionally baked into the system.
  7. Servicing institutional rather than individual interests: the analytics have more direct benefit for educational institutions and analytics providers than for students.

He goes on to list several questionable values embedded in learning analytics:

  1. A blind faith in data: There is a risk of a contemporary over-valuation of the importance of data.
  2. Concerns over the data economy: What are the implications when student data is monetized by companies?
  3. The limits of free choice and individual agency: Does a reliance on analytic data remove the ability of students and educators to have a say in their education?
  4. An implicit techno-idealism: Part of learning analytics is a belief in the beneficial impacts of technology.

Toward this, Selwyn proposes a few broad areas for change designed to improve learning analytics’ standing within a wider field of concern:

  1. Rethink the design of learning analytics: allow for more transparency and customization for students.
  2. Rethink the economics of learning analytics: give students ownership of their data.
  3. Rethink the governance of learning analytics: establish regulatory oversight for student data.
  4. A better public understanding of learning analytics: educate the wider public about the ethical implications of applying learning analytics to student data.

Overall, Selwyn’s main point remains the most valuable: the idea of learning analytics should be examined within the full constellation of social and cultural structures within which it is embedded. Like any form of data analytics, learning analytics does not exist as a perfect guide to action, and the insights it generates need to be understood as only partial and shaped by the mechanisms designed to generate the data. In the end, Selwyn’s account is a helpful one – it is useful to have such critical voices welcomed into SoLAR – but the level at which he casts his recommendations remains too broad for anything other than a starting point. Setting clear policy goals and fostering a broad understanding of learning analytics are hopeful external changes that can be made to the context within which learning analytics is used, but ultimately it is those working in the field – those constructing the systems of data generation and analysis – who need to alter their approaches, both in the ‘ownership’ and in the interpretation of student data. This reinforces the need to change how we understand what ‘data’ is and how we think about using it. Following Selwyn, the most important change might be to re-evaluate the ontological constitution of data and our connection to it, coming to understand it not as something distinct from students’ education, but as an integral part of it.

Valuing technology-enhanced academic conferences for continuing professional development: a systematic literature review, Professional Development in Education, by Maria Spilker (read by Naomi Beckett)

This literature review gives an analysis of the different approaches taken to technologically enhance academic conferences for continued professional development. Although there have been advances and new practices emerging, a coherent approach was lacking, and conferences were being evaluated in narrow ways that did not consider all perspectives.

‘Continued professional development for academics is critical in times of increased speed and innovation, and this intensifies the responsibilities of academics.’ 

This makes it more important to ensure that when academics come together at a conference, there is a systematic approach to what they should be getting out of the time spent there. The paper suggests this needs to be looked at when first starting to plan a conference: what are the values?

The paper talks about developing different learning experiences at a conference to engage staff and build their professional development. There is often little time for reflection and the paper suggests looking at more ways to include this; using technology is one way this could be done. Engagement on Twitter, for example, gives delegates another channel to discuss and network, taking them away from traditional conference formats. Moving more conferences online also gives delegates the opportunity to reach out to further networks.

The paper mentions their Value Creation Network, looking at what value we should be taking out of conferences. This includes immediate value, potential value, applied value, realised value, and re-framing value. Considering these at the outset is a good way to start thinking about how we frame academic conferences, so that delegates get the most out of the time spent there and work on their own professional development too.

We asked teenagers what adults are missing about technology. This was the best response by Taylor Fang (read by Paddy Uglow)

Some thoughts I took away:

  • Traditionally a “screen” was a thing to hide or protect, and to aid privacy. Now it’s very much the opposite.
  • Has society changed so much that social media is the only place that young people can express themselves and build a picture of who they are and what their place is in the world?
  • Adults have a duty to help young people find other ways to show who they are to the world (and to reflect that back to themselves)
  • Digital = data = monetisation: everything young people do online goes into a money-making system which doesn’t have their benefit as its primary goal.
  • Young people are growing up in a world where their importance and value are quantified by stats, likes, shares etc; all these numbers demonstrate to them that they’re less important than other people, and encourage desperate measures to improve their metrics.
  • Does a meal/holiday/party/etc really exist unless it’s been published and Liked?
  • Does the same apply to Learning Analytics? Are some of the most useful learning experiences those which don’t have a definition or a grade?
Curriculum design – notes from the reading group

Exploring curriculum design approaches (report by Suzanne Collins)

Suzanne talked about her work exploring curriculum design approaches (separate blog post), where she looks at methodologies such as ABC, Carpe Diem, CAIeRO and ELDeR.

ABC Learning Design (notes by Suzi Wells)

ABC learning design is a rapid design/review methodology developed by Clive Young and Nataša Perović in 2014, drawing on Laurillard’s ‘Conversational Framework’.

The method is centred around a 90-minute workshop, during which participants:

  • Describe their unit in a tweet
  • Map out the types of activity, currently undertaken or planned, against Laurillard’s six learning types
  • Storyboard the unit using the ABC cards

In the DEO a few of us – Roger Gardner, Suzanne Collins, and I – have trialled this approach. Initially this was with a small number of academics interested in redesigning their credit-bearing units. We made much fuller use of it when supporting the design of the FutureLearn courses, following which Suzanne and I presented on this at a UCL conference in 2018: our presentation on using ABC at Bristol.

One advantage of the methodology is that you could run a single 90-minute workshop looking at an entire programme, allowing potential links between the units to become apparent. The short length of the workshop gives at least some chance of getting everyone from the unit together in one place.

The cards are Creative Commons licensed and have been widely adapted, with users adding activities and terminology more relevant to their context. On the ABC site you can download ABC cards for MOOCs designed for FutureLearn and EdX. At the conference we heard how people have used stickers in the storyboarding stage to surface things that they are interested in: employability skills, alignment with the education strategy, and university-wide themes (such as Bristol’s global citizenship, innovation & enterprise, and sustainable futures themes).

Obviously a 90 minute workshop is not going to give you time to finalise many details but ABC is quick to learn, very adaptable, and sparks good conversations. It’s remarkable how much can be done in a short time.

Beyond podcasting: creative approaches to designing educational audio (notes by Chrysanthi Tseloudi)

This paper talks about a pilot that aimed to encourage academics to use podcasts in their teaching through a tool in their VLE. The pilot included initial workshops and various types of support for the 25 participants who decided to try it out. All participants used audio, apart from one team, which used video podcasts. Nine of them shared their experience for this paper. They had produced various types of resources: videos about clinical techniques (nursing), audio based on research projects which also received audio feedback from the academic (sport), “questions you’re afraid to ask” (art & design), answers to distance learning students’ questions to reduce the sense of isolation in the VLE (communications), etc.

Academics enjoyed using audio for learner-centred pedagogies, but they also encountered some barriers. Expectations of high quality may be a barrier for both staff and students, while assessing student work in this format is time consuming. Not being familiar with the technology can be frustrating for staff and can hold them back, as they would rather not ask students to do something they didn’t feel confident they could do well themselves. Students are not necessarily more confident than them in using this technology. Following the pilot, the institution’s capacity to support such activities was evaluated and some solutions to support staff were devised.

This was a nice paper with a variety of ideas on using audio for teaching. I found the point about voice on the VLE increasing connectivity and reducing isolation particularly interesting, and would love to see any relevant research on this.

Suggested reading

Approaches to curriculum design
Notes from the reading group – free choice

Digital wellbeing toolkit from BBC R&D (notes from Suzi Wells)

This toolkit from BBC R&D is designed for addressing wellbeing in digital product development. Given the increasing focus on student wellbeing, I was interested in whether it could be useful for online and blended learning.

The key resources provided are a set of values cards along with some exploration of what these mean, particularly for young people (16-34 year olds). These were developed by the BBC through desk research, focus groups and surveys.

The toolkit – the flashcards especially – could certainly be used to sense-check whether very digital teaching methods are really including and supporting the kinds of things we might take for granted in face-to-face education. It could also be useful for working with students to discuss the kinds of things that are important to them in our context, and identifying the kinds of things that really aren’t.

Some of the values seem too obvious (“being safe and well”, “receiving recognition”), or baked in to education (“achieving goals”, “growing myself”), and I worry that could be off-putting to some audiences. The language also seemed a little strange – “human values”, as though “humans” were an alien species. It can feel like the target audience for the more descriptive parts is someone who has never met a “human”, much less a young one. Nonetheless, the flashcards in particular could be a useful way to kick off discussions.

Three example flash cards showing the values: being inspired, expressing myself, having autonomy

New studies show the cost of student laptop use in lecture classes (notes from Michael Marcinkowski)

The article I read for this session highlighted two new-ish articles which studied the impact that student laptop use had on learning during lectures. In the roughly ten years that laptops have been a common sight in lecture halls, a number of studies have looked at what impact they have on notetaking during class. These previous studies have frequently found a negative association with laptop use for notetaking in lectures, not only for the student using the laptop, but also for other students sitting nearby, distracted by the laptop screen.

The article took a look at two new studies that attempted to tackle some of the limitations of previous work, particularly addressing the correlative nature of previous findings: perhaps low performing students prefer to use laptops for notetaking so that they can do something else during lectures.

What bears mentioning is that there is something somewhat quaint about studying student laptop use. In most cases, it seems to be a foregone conclusion and there is no getting it back into the box. Students will use laptops and other digital technologies in class — there’s almost no other option at this point. Nevertheless, the studies proved interesting.

The first of the highlighted studies featured an experimental setup, randomly assigning students in different sections of an economics class to different conditions: notetaking with laptop, without laptop, or with a tablet lying flat on the desk. The last condition was designed to test the effect of students being distracted by seeing other students’ screens, the supposition being that if the tablet was laid flat on a desk, it wouldn’t be visible to other students. The students’ performance was then measured based on a final exam taken by students across the three conditions.

After controlling for demographics, GPA, and ACT entrance exam scores, the researchers found that performance was lower for students using digital technologies for notetaking. However, while performance was lower on the multiple choice and short answer sections of the exam, performance on the essay portion of the exam was the same across all three conditions.

While the study did address some shortcomings of previous studies (particularly with its randomized experimental design), it also introduced several others. Importantly, it raised questions about how teachers might teach differently when faced with a class of laptop users, or what effect forcing a student who isn’t comfortable using a laptop might have on their performance. Also, given that multiple sections of an economics class were the subject of the study, what role does the discipline being lectured on play in the impact of laptop use?

The second study attempted to address these through a novel design which linked students’ propensity to use or not use laptops in optional-use classes to whether they were required to use, or prohibited from using, them in another class on the same day. Researchers looked at institution-wide student performance at an institution with a mix of classes which required, forbade, or had no rules about laptop use.

By looking at student performance in classes in which laptop use was optional, and linking that performance to whether students would be influenced in their laptop choices by other classes held the same day, researchers were able to measure student performance when students had a chance not to use a laptop in class. That is, the design allowed researchers to estimate in general how many students might be using a laptop in a laptop-optional class, while still allowing individual students to make a choice based on preference.

What they found was that student performance worsened for classes that shared a day with laptop mandated classes and improved on days when classes were shared with laptop prohibited classes. This is in line with previous studies, but interestingly, the negative effects were seen more strongly in weaker students and in quantitative classes.

In the end, even while these two new studies reinforce what had been previously demonstrated about student laptop use, is there anything that can be done to counter what seem to be the negative effects of laptop use for notetaking? More than anything, what seems to be needed are studies looking at how to boost student performance when using technology in the classroom.

StudyGotchi: Tamagotchi-Like Game-Mechanics to Motivate Students During a Programming Course & To Gamify or Not to Gamify: Towards Developing Design Guidelines for Mobile Language Learning Applications to Support User Experience; two poster/demo papers in the EC-TEL 2019 Proceedings (notes from Chrysanthi Tseloudi)

Both papers talk about the authors’ findings on gamified applications related to learning.

The first concerns the app StudyGotchi, based on the Tamagotchi (a virtual pet the user takes care of), which aims to encourage first-year Java programming students to complete tasks on their Moodle platform in order to keep a virtual teacher happy. Less than half of the students downloaded the app, with half of those receiving the control version without the game functions (180) and half receiving the game version (194). According to the survey, some of those who didn’t download it reported that this was because they weren’t interested in it. Of those who replied to whether they used it, less than half said they did. Data collected from students who used either version of the app showed no difference between the game and non-game groups, in either online behaviour or exam grades. The authors attribute this to the lack of interaction, personalisation options and immediate feedback on the students’ actions on Moodle. I also wonder whether a virtual teacher to be made happy is the best choice of “pet”, when hopefully there is already a real teacher supporting students’ learning. Maybe a virtual brain with regions that light up when a quiz is completed, or any non-realistic representation connected to the students’ own development, would do more to increase students’ intrinsic motivation, since ideally they would be learning for themselves, not to make someone else happy.

The second paper compares two language learning apps, one of which is gamified. The non-gamified app (LearnIT ASAP) includes exercises where students fill in missing words, feedback showing whether the answer is correct or incorrect, and statistics to track progress. The gamified app (Starfighter) has students steer through an asteroid field by selecting answers to given exercises, with a leaderboard to track progress and compete with peers. The evaluation involved interviewing 11 individuals aged 20 to 50. The authors found that younger and older participants had different views about the types of interactions and the aesthetics of the two apps: younger participants would have preferred swiping to tapping, while older participants seemed to find the non-gamified app comfortable because it looked like a webpage, but were not so sure about the gamified app. The game mechanics of Starfighter were thought to be more engaging, while the pedagogical approach of LearnIT ASAP was thought to be better in terms of instructional value and effectiveness. While the authors say the main difference between the apps is gamification, given the finding that one app’s pedagogical approach is better, I wonder if that is actually the case. Which game elements actually improve engagement, and how, is still being researched, so I would really like to see comparisons of language learning apps where the presence or absence of game elements is indeed the only difference. Using different pedagogical approaches between the apps is less likely to shed light on the best game elements to use, but it does emphasise the difficulty of creating an application that is both educationally valuable and fun at the same time.

AI and education – notes from reading group

Will Higher Ed Keep AI in Check? (notes from Chrysanthi Tseloudi)

In this article, Frederick Singer argues that whether the future of AI in education is a positive or a dystopian one depends not on a decision to use or not use AI, but on retaining control over how it is used.

The author starts by mentioning a few examples of how AI is, or may be, used outside of education – both risky and useful. They then move on to AI’s use in educational contexts, with examples including an AI chatbot for students’ queries about enrolment and financial issues, and AI-powered video transcription that can help accessibility. The area they identify as having the most potential for both risk and benefit is AI helping educators address individual students’ needs and indicating when intervention is needed; there are concerns about data privacy, and about achieving the opposite of the intended results if educators lose control.

The final example they mention is using AI in the admissions process, to sidestep human biases and help identify promising applicants, but without automatically rejecting students that are not identified as promising by the AI tool.

I think this is something to be cautious about. Using AI for assessment – whether for admission, marking activities, tracking progress, etc. – certainly has potential, but AI is not free of human biases. In fact, there have been several examples where it is full of them. The article Rise of the racist robots – how AI is learning all our worst impulses and Cathy O’Neil’s TED talk The era of blind faith in big data must end report that AI algorithms can be racist and sexist because they rely on datasets that already contain biases. A dataset of successful people is essentially a dataset of past human opinions about who can be successful, and human opinions are biased: if, for example, only a specific group of people have been culturally allowed to be successful, a person who doesn’t belong to that group will not be seen by the AI as equally (or more) promising as those who do belong to it. AI algorithms can also be opaque – it is not necessarily obvious what they are picking up on to make their judgements – so it’s important to be vigilant, and for the scientists who build them to implement ways of counteracting the potential discrimination arising from them.

It’s not hard to see how this could apply in educational contexts. For example, algorithms trained on datasets from disciplines that are currently male-dominated might rank women as less likely to succeed, and algorithms trained on data consisting overwhelmingly of students of one nationality, with very few international students, might mark international students’ work lower. There are probably ways to prevent this, but it requires awareness that the bias can exist. All this considered, it would be rather worrying if educators saw AI as a tool that is free of bias. Understanding the potential issues is key to retaining control.
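As a toy illustration (with entirely hypothetical data), even the simplest possible “model” – one that just learns historical base rates per group – will faithfully reproduce whatever bias is baked into its training labels:

```python
from collections import defaultdict

# Hypothetical historical admissions outcomes: (group, admitted).
# Group "A" was culturally favoured in the past; group "B" was not.
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 30 + [("B", 0)] * 70)

def train_base_rate_model(records):
    """Learn each group's historical admission rate from past decisions."""
    counts = defaultdict(lambda: [0, 0])  # group -> [admitted, total]
    for group, admitted in records:
        counts[group][0] += admitted
        counts[group][1] += 1
    return {g: a / t for g, (a, t) in counts.items()}

model = train_base_rate_model(history)
print(model)  # {'A': 0.8, 'B': 0.3}
```

The “algorithm” here contains no explicit prejudice – it simply fits past decisions – yet it scores new applicants from group B as far less promising, purely because of the bias in the historical labels. Real admissions tools are vastly more sophisticated, but the underlying risk is the same.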

How Artificial Intelligence Can Change Higher Education (notes from Michael Marcinkowski)

For this meeting, I read ‘How Artificial Intelligence Can Change Higher Education,’ a profile of Sebastian Thrun. The article detailed Thrun’s involvement with the popularization of massive open online courses and the founding of his company, Udacity. Developed out of Thrun’s background working on artificial intelligence at Google, Udacity approaches the question of education as a matter of scale: how can digital systems be used to teach vast numbers of people all over the world? For Thrun, the challenge for education is how to develop student mastery of a subject through online interactions, while at the same time widening the pathways for participation in higher education.

The article, unfortunately, focused mostly on the parallels between Thrun’s work in education and his involvement with the development of autonomous vehicles, highlighting the potential that artificial intelligence technologies have for both while avoiding any discussion of the particulars of how this transformational vision might be achieved.

Nevertheless, the article still opened up some interesting concerns around scale, and how best to approach the question of how education might function at a scale larger than traditionally conceived. At the heart of this question is the role that autonomous systems might have in helping to manage this kind of large-scale educational system – that is, at what point, and for what tasks, is it appropriate to take human educators out of the loop, or to place them at a further remove from the student? In particular, areas such as the monitoring of student wellbeing and one-on-one tutoring came out as ripe for both innovation and controversy.

While it was disappointing that the article largely avoided the actual issues of using artificial intelligence in education, it did offer an unplanned-for lesson about AI in education. As with the hype surrounding self-driving cars, the promises of a new educational paradigm put forward in this 2012 article still seem far off. While the mythos of the Silicon Valley innovator might cast Thrun as a rebel singularly able to see the true path forward for education, most of his propositions for education, when they were not pie-in-the-sky fantasies, repeated well-worn opinions present throughout the history of education.

Suggested reading

Curriculum theories – notes from reading group

Thanks to Sarah Davies for setting us some fascinating reading!

Connected curriculum chapter 1 (notes from Chris Adams)

The connected curriculum is a piece of work by Dilly Fung from UCL. It is an explicit attempt to outline how departments in research-intensive universities can develop excellent teaching by integrating their research into it; the ‘connected’ part of the title is the link between research and teaching. At its heart is the idea that the predominant mode of learning for undergraduates should be active enquiry – but rather than students discovering for themselves things which are well established, they should be discovering things at the boundaries of what is known, just as researchers do.

It has six strands:

  • Students connect with researchers and with the institution’s research. Or, in other words, the research work of the department is explicitly built into the curriculum.
  • A throughline of research activity is built into each programme. Properly design the curriculum so that research strands run through it, building stepwise on what has come before.
  • Students make connections across subjects and out to the world. Interdisciplinarity! Real world relevance.
  • Students connect academic learning with workplace learning. Not only should we be teaching them transferable skills for a world of rapid technological change, but we need to tell them that too.
  • Students learn to produce outputs – assessments directed at an audience. Don’t just test them with exams
  • Students connect with each other, across phases and with alumni. This will create a sense of community and belonging.

This last point is then expanded upon. Fung posits that the curriculum is not just a list of what should be learned, but the whole experience as lived by the student. Viewing the curriculum as a narrow set of learning outcomes does not produce the kind of people that society needs, but is a consequence of the audit culture that pervades higher education nowadays. Not all audit is bad – the days when ‘academic freedom’ gave people tenure and the freedom to teach terribly and not do any research are disappearing, and peer review is an integral part of the university system – but in order to address complex global challenges we need a values-based curriculum, ‘defined as the development of new understandings and practices, through dialogue and human relationships, which make an impact for good in the world.’

I liked it sufficiently to buy the whole book. It addresses a lot of issues that I see in my own department – the separation of research from teaching, and the over-reliance on exams, and the lack of community, for example.

Connected curriculum chapter 2 (notes from Suzi Wells)

As mentioned in chapter 1, the core proposition is that the curriculum should be ‘research-based’ – ie most student learning “should reflect the kinds of active, critical and analytic enquiry undertaken by researchers”.

Fung gives this useful definition of what that means in practice. Students should:

  • Generate new knowledge through data gathering and analysis
  • Disseminate their findings
  • Refine their understanding through feedback on the dissemination

All of it seems fairly uncontroversial in theory and tends to reflect current practice, or at least what we aspire to in current practice. There’s some discussion of the differences in what research means to different disciplines, and how that filters through into assessment of students, and potentially some useful studies on just how effective this all is.

Fung mentions the Boyer Commission (US 1998) and its proposed academic bill of rights, including (for research intensive institutions): “expectation of and opportunity for work with talented senior researchers to help and guide the student’s efforts”. Given increasing student numbers, this is possibly a less realistic expectation to meaningfully meet than it once was.

There’s some useful discussion about what is needed to make research-based-teaching work.

I was particularly interested in the idea that providing the opportunity for this form of learning isn’t everything. Socio-economic factors mean that students may have differing beliefs about their own agency. Fung cites Baxter Magolda (2004) on the importance of students having ‘self-authorship’, which includes ‘belief in oneself as possessing the capacity to create new knowledge’ and ‘the ability to play a part within knowledge-building communities’. You can’t assume all students arrive with the same level of this, and it will affect their ability to participate.

This part of the chapter also talks about the importance of not just sending students off “into the unknown to fend for themselves” – imagine a forest of ivory towers – but to give them support & structure. Activities need to be framed within human interactions (including peer support).

Towards the end there is a nod to it being anglo-centric – African and Asian educational philosophy and practice may be different – but little detail is given.

How Emotion Matters in Four Key Relationships in Teaching and Learning in Higher Education (notes from Roger Gardner)

This is a 2016 article by Kathleen Quinlan, who is now Director of the Centre for the Study of Higher Education and Reader in Higher Education at University of Kent, but was working at Oxford when this was written.

She writes that while historically there has been less focus on Bloom’s affective domain than the cognitive, interest in the relation of emotions to learning has recently been growing, although it is still under-researched. The article comes out of a review of the existing literature and conversations with teachers at the National University of Singapore in August 2014.

The paper focusses on four relationships: students with the subject matter, teachers, their peers and what she calls “their developing selves”. For each section Quinlan includes a summary of implications for teaching practice, which provide some very useful suggestions, ranging from simple things such as encouraging students to introduce each other when starting activities to help foster peer relationships, to advocating further research and exploration into when it is appropriate and educationally beneficial for teachers to express emotions and when not.

Quinlan says “discussions about intangibles such as emotions and relationships are often sidelined”, but it now seems essential to prioritise these if we are to support student wellbeing, and this paper provides some helpful prompts and suggestions for reflection and developing our practice. If you are short of time I recommend looking at the bullet-pointed “implications for practice” sections.

What is “significant learning”? (notes from Chrysanthi Tseloudi)

In this piece, Dr. Fink talks about the Taxonomy of Significant Learning; a taxonomy that refers to new kinds of learning that go beyond the cognitive learning that Bloom’s taxonomy addresses. The taxonomy of significant learning – where significant learning occurs when there is a lasting change in the learner that is important in their life – is not hierarchical, but relational and interactive. It includes six categories of learning:

Foundational knowledge: the ability to remember and understand specific information as well as ideas and perspectives, providing the basis for other kinds of learning.

Application: learning to engage in a new kind of action (intellectual, physical, social, etc) and develop skills that allow the learner to act on other kinds of learning, making them useful.

Integration: learning to see, understand, and make new connections between different things, people, ideas, realms of ideas or realms of life. This gives learners new (especially intellectual) power.

Human Dimension: learning about the human significance of things they are learning – understanding something about themselves or others, getting a new vision of who they want to become, understanding the social implications of things they have learned or how to better interact with others.

Caring: developing new feelings, interests and values, and/or caring more about something than before; caring about something feeds the learner’s energy to learn about it and make it a part of their lives.

Learning how to learn: learning about the learning process; how to learn more efficiently, how to learn about a specific method or in a specific way, which enables the learner to keep on learning in the future with increasing effectiveness.

The author notes that each kind of learning is related to the others and achieving one kind helps achieve the others. The more kinds of learning involved, the more significant is the learning that occurs – with the most significant kind being the one that encompasses all six categories of the taxonomy.

Education Principles: Designing learning and assessment in the digital age (notes from Naomi Beckett)

This short paper is part of a guide written by Jisc. It covers what Education Principles are and why they are such a vital part of any strategy. As someone unspecialised in this area, I found it an interesting read for understanding how principles can bring staff together to engage with and develop different education strategies. The guide talks about how principles can ‘provide a common language, and reference point for evaluating change’.

The paper talks about having a benchmark against which everyone can check their progress. I like this idea. So often projects become too big, and the ideas and values first agreed as a team are lost. Having a set of principles is a way to bring everything back together, and a useful way to enable a wide variety of staff to engage with each other. The guide mentions how having these principles means there is a ‘common agreement on what is fundamentally important.’

Having these principles developed at the beginning of a project puts the important ideas and values into motion and is a place to look back to when problems arise. Principles should be action oriented, and not state the obvious. Developing them in this way allows for a range of staff members to bring in different ideas and think about how they want to communicate their own message.

I also followed up by reading ‘Why use assessment and feedback principles?’ from Strathclyde’s Re-Engineering Assessment Practices (REAP) project.

Suggested reading

OU Innovating Pedagogy 2019 – notes from reading group

All read sections of the OU Innovating Pedagogies 2019 report

Learning with Robots (read by Naomi Beckett)

This short piece talked about how robots are now being used for educational purposes, and which ones are being used. It talked a lot about how robots can enhance learning by learning things themselves: learners can teach something to the robot as a way of showing they have accomplished a new skill, and in turn the robot gains new information.

The article also talked about how robots can enable a calm approach to teaching. Robots won’t raise their voice or show (real) emotions in a session, and it is argued that this will allow students to learn in a calmer environment. It also discusses how having a robot as a learning tool may excite or motivate learners. It only briefly mentions, though, how a robot would solely conduct a class full of students.

Some aspects of the article did make sense of how robots could aid learning, but these ideas didn’t go into much depth. It discussed how robots could talk in several languages, and so converse comfortably with a wider range of students. It also talked about how robots could act as mediators for students, able to check in or provide advice at any time of day. They could handle routine tasks and issues, freeing up teachers’ time to spend with their learners.

As mentioned in the article, ‘many people have an inherent distrust of advancing technologies.’ There are several questions to ask about how far a robot should be integrated into a learning environment, and when it becomes too much. But there are a number of interesting points in the article about how robots are making small steps to aid and enhance learning.

Reading this section got me thinking about the AV1 robot, created by No Isolation to ‘reduce loneliness and social isolation through warm technology’. AV1 is a robot for children who are too ill to go to school: the robot sits in the class and the child at home connects through it. Using an app, the children can take part in the classroom. They can raise their hand to answer questions, talk to nearby students, ask questions, and just listen if they want to. A great use of technology to keep students engaged with their learning and classmates.

Decolonising learning (read by Sarah Davies)

This section was not about decolonising the curriculum – itself an important area for Bristol – but rather about how digital environments, tools and activities can be used in ways which invert power relationships and the cultural and educational capital of the dominant culture, and support colonised or marginalised populations in education, sense-making and cultural development which is meaningful to them. It notes that decolonisation requires systematic unsettling change.

The article reminds us that we need to acknowledge the ways in which digital presence can contribute to colonisation – so digital environments created by a dominant culture may not create spaces for the kind of discussions, activities and issues which are meaningful to those of other cultures. It suggests that MOOCs can often be a form of digital colonisation – people from all over the world learn from massive courses produced in just a few countries.

In contrast, digital decolonisation considers how to support colonised, under-represented, uprooted or otherwise marginalised people with technology in order to:

  • connect them with a shared history,
  • support a critical perspective on their present,
  • provide tools for them to shape their futures.

But how to use the technology must be decided by the people themselves.

Critical pedagogies – in which students are expressly encouraged to question and challenge power structures, authority and the status quo – provide frameworks for the academic success of diverse students – eg by seeking to provide a way of maintaining their cultural integrity while achieving academic success, or to sustain the cultural competence of their community while gaining access to the dominant cultural competence.

Digital storytelling is an example of a pedagogical tool that can be used for decolonising purposes – empowering students to tell their own stories, turning a critical lens on settler colonialism, capturing stories of indigenous or marginalised people taking action on issues, critiques of colonial nations.

Two final messages from this article which resonated for me were that success in or after HE for some groups of students may be at odds with notions of success in the dominant society (as captured in things like Graduate Outcomes); and that education needs to be reimagined as an activity that serves the needs of local communities – though what that means for Bristol and the local, national and international communities it exists within, I’m not sure.

Virtual studios (read by Suzi Wells)

I found this a useful exercise in thinking about what a studio is and what it is for – and how much of that might be reimagined online. Studios are described in the report as collaborative, creative, social, communal spaces. They contain creative artefacts (sketches, models, objects). Learning in studios is by doing and is often peer-supported with tutors facilitating and guiding rather than instructing.

The report describes virtual studios as being focused on digital artefacts. “Virtual studios are all about online exchange of ideas, rapid feedback from tutors and peers, checks on progress against learning outcomes, and collaboration”

The first benefit of virtual studios given is scale: a studio can be for 100s of learners. This left me wondering if this is in conflict with the idea of studios as a community.

Virtual studios are also described as “hubs”, an idea I would have liked to explore further. I wanted to know how a hub is different from a community. What are we trying to achieve when we make something hub-like? I suppose a hub is a place which provides a starting point or a loose join between disparate activities or organisations. It’s not just a community, but has other communities floating around it.

Virtual studios can be a way to give more people (even fully open) access to experts and facilities. The example given was the (oft-cited, so fairly unique?) ds106 Digital Storytelling course.

Areas to explore further:

  • Could e-portfolios benefit from being grounded in something more virtual-studio-like (how much are they doing that already)?
  • How big can a virtual studio be before it loses the community feeling? Is there a way to scale community?

Place Based Learning (read by Michael Marcinkowski)

While the article on place-based learning only provided a surface view of the approach, I found it very interesting in two distinct ways.

First, it focused on place-based learning as not being solely the province of lessons conducted in the field, away from the classroom. What was highlighted in the article was the way that place-based learning could just as easily take place in the classroom, with students studying their local communities or local histories from their desks. Whether in the classroom or the field, the focus is on how students are able to make robust connections between their personal situation and their learning.

This kind of connection between the learner and their local community provides the foundation for the second point of interest in the article: that place-based learning can easily incorporate aspects of critical pedagogy. As students explore their local communities, they can both explore critical issues facing the community and build on their own experiences in order to support their learning. One example that was noted was having students explore the function of public transportation networks in their community, looking at questions of availability, accommodation, and planning.

An important development in place-based learning has been the rise in the ubiquity of smartphones and other location-aware devices. By tapping into GPS and other forms of location networks, it becomes possible to develop applications that allow learners to dynamically access information about their surroundings. The article mentions one project that allows language learners to access vocabulary specific to the locations in which it is used – for instance, transit-based vocabulary guides triggered near bus stops. The idea is that such systems allow for the in-situ acquisition of vocabulary in a way which is both useful in the moment and reinforces learning.

There are already a number of good examples of place-based learning that have been developed out of the University of Bristol, including the Bristol Futures course, which encourages students to explore and engage with the wider city of Bristol, and the Romantic Bristol smartphone app, which highlights places of historic and literary importance around the city.

Particularly as the University begins to confront its legacy of involvement with the slave trade, there look to be a number of ways in which place-based education can continue to be fostered among the University community.

Roots of Empathy (read by Chrysanthi Tseloudi)

This section describes a classroom programme that aims to teach children empathy, so they can have healthy and constructive social interactions.

In this programme, children aged 5 to 13 receive visits to their school class every three weeks from a local baby, the baby’s parent and a Roots of Empathy instructor. The children observe how the baby and its feelings develop, and its interactions with the parent. With the guidance of the instructor, the children learn about infant development and identify the baby’s feelings, their own and those of others; they then reflect on them, describe them and explain them. There are opportunities for discussion and other activities, including the children recording songs for their baby and reflecting on what they would like the baby’s future to be like. The curriculum is broken down into themes, which are then broken down further into age ranges. While the activities focus on feelings, some use knowledge and skills from school subjects, e.g. mathematics. Research on the programme has shown positive results in decreasing aggression and increasing positive social behaviours.

It was interesting to read about this approach. Something that stood out for me was that while learners identifying their own feelings is mentioned, it is not obvious whether this is an explicit aim of the programme. That made me wonder whether it is assumed that a person able to identify others’ feelings is definitely able to identify their own (in which case the programme addresses this skill implicitly), whether it is assumed that the children are able to do this already, or whether knowing one’s own feelings is not considered an important skill in healthy social interactions. I also wondered how children who have significant difficulties identifying their own or others’ feelings fare in this programme, and if or how they are further supported.

JISC Horizon Report on wellbeing and mental health – notes from reading group

Suzi read the first part of the mental health and wellbeing section of the JISC Horizon Report. This talked about the increasing demands on mental health services and discussed some possible causes, including worries about money and future prospects, diet, use of social media, and reduced stigma around talking about mental health.

Many institutions are increasing their efforts around student wellbeing. The report mentioned a new task force looking at the transition to university and support in first year: Education Transitions Network.

Four technologies are mentioned as currently being used within HE:

  • Learning analytics to identify students who may need checking in on
  • Apps and online mood diaries, online counselling
  • Peer support (overseen by counsellors) via Big White Wall
  • Chatbots

The report didn’t have a great amount of detail on how these are used. Using learning analytics to see who’s not engaging with online content seems like the simplest application and is doable in many systems but even this would require care. Revealing that you are keeping students under surveillance in a way they might not expect may cause them to lose trust in the institution and retreat further (or game the system to avoid interventions). Then again, maybe it’s just helping us know the sort of things a student might expect us to know. Universities can be quite disjointed – in a way that may not seem natural or helpful to students. Analytics could provide much needed synaptic connections.

It also struck me that using technology to support wellbeing (and even mental health) is in some ways similar to teaching: what you’re trying to achieve is not simple to define and open to debate.

Johannes read the blog post Learning Analytics as a tool for supporting student wellbeing and watched a presentation by Samantha Ahern, a data scientist at UCL who researches the implications of learning analytics for student wellbeing.

In her presentation, she outlined the HE sector’s current problem with student wellbeing and provided some alarming numbers about the increase in reported mental disorders among young adults (16–24 years old). According to the NHS survey on mental health in the UK, around 8% of male and 9% of female participants had diagnosed mental health issues in 1992. These numbers increased to more than 19% of males and 26% of females in 2014. Interestingly, females are much more likely to report mental health issues than males, who, unfortunately, are the ones doing the most harm to themselves.

In her opinion, HE institutions have a great responsibility to act when it comes to tackling mental health problems. However, not all activities actually support students. She argues that too many university policies put the onus to act on the student, yet the students who need help the most often do not report their problems. Universities should therefore take a much more active role, and some rethinking needs to take place.

Her main argument is that although learning analytics is still in its infancy, and might sound like a scary and complicated topic, it is worth researching, as it has the potential to really improve student wellbeing when done correctly.

It was very interesting to read and listen to her arguments, although the piece was meant as an introduction to learning analytics and did not offer solutions to the issues.

Roger read “AI in Education – Automatic Essay Scoring”, referenced on page 27 of the JISC Horizons report. Is AI ready to give “professors a break”, as suggested in a 2013 article from the New York Times referring to edX’s work on software that automatically assigns a grade (not feedback) to essays? If so, then surely this would improve staff wellbeing?

Fortunately for the Mail Online, who responded to the same edX news in outraged fashion (“College students pulling all-nighters to carefully craft their essays may soon be denied the dignity of having a human being actually grade their work”) it doesn’t seem that this is likely any time soon.

Recent work by two Stanford researchers built on previous results from a competition to develop an automatic essay scoring tool, increasing the alignment of the software with human scorers from 81% in the previous competition to 94.5%. This immediately raised the question for me: how consistent are human scorers? The article did at least acknowledge this, saying “assessment variation between human graders is not something that has been deeply scientifically explored and is more than likely to differ greatly between individuals.”
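
The article doesn’t say how its "alignment" figures were computed, but a standard agreement measure in automated essay scoring competitions is quadratic weighted kappa, which credits a machine (or second human) scorer more for near-misses than for wild disagreements. A sketch of the usual formulation, not the article’s actual metric:

```python
import numpy as np

def quadratic_weighted_kappa(a, b, n_scores):
    """Agreement between two raters on an ordinal scale 0..n_scores-1.
    1.0 = perfect agreement, 0.0 = chance-level, negative = worse than chance.
    Quadratic weights penalise large score gaps more than small ones."""
    observed = np.zeros((n_scores, n_scores))
    for i, j in zip(a, b):
        observed[i, j] += 1
    observed /= observed.sum()
    # Expected agreement under independence (outer product of marginals)
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    idx = np.arange(n_scores)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_scores - 1) ** 2
    return 1 - (weights * observed).sum() / (weights * expected).sum()

# Identical score vectors give perfect agreement:
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # 1.0
```

Running the same statistic on two *human* graders would give exactly the baseline comparison the article notes is missing.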

Apparently the edX system is improving as more schools and universities get involved, giving it more data to work with, but their website states it is not currently available as a service. The article acknowledges the scepticism in some quarters, in particular the work of Les Perelman, and concludes that there is still “a long way to go”.

Chrysanthi read Learning analytics: help or hindrance in the quest for better student mental wellbeing?, which discusses the data learners may want to see about themselves and what should happen if the data suggests they are falling behind.

Learning analytics can detect signs that a student may be facing mental health issues and/or may drop out. When using learning analytics to detect these signs, the following should be considered:

  • Gather students’ data ethically and focus on the appropriate metrics to see whether a student is falling behind and what factors may be contributing to this.
  • Give students a choice about the data they see about themselves and its format, especially when comparisons with their cohort are involved.
  • Support students at risk, bearing in mind they may prefer to be supported by other students or at least members of staff they know.
  • Talk to students about how to better use their data and how to best support them.

Chrysanthi also read the “What does the future hold” section in JISC Horizon Report Mental Health and Wellbeing, which attempts to predict how wellbeing may be handled in the next few years:

  • Within 2 years, students will have a better understanding of their mental health, more agency, increased expectations for university support and will be more likely to disclose their mental health conditions, as they become destigmatised. Institutions will support them by easing transitions to university and providing flexible, bite-sized courses that students can take breaks from. The importance of staff mental health will also be recognised. New apps will attempt to offer mental wellbeing support.
  • In 3-5 years, institutions will manage and facilitate students supporting each other. Students’ and staff wellbeing will be considered in policy and system design, while analytics will be used to warn about circumstances changing. We may see companion robots supporting students’ needs.
  • Five or more years out, analytics may include data from the beginning of students’ learning journey all the way to university, to better predict risks.

The Horizon group then gives suggestions to help with the wellbeing challenge, including providing guidance, offering education on learning, personal and life skills, and regularly consulting the student voice. Next steps include exploring the possibility of a wellbeing data trust (to enable organisations to share sensitive student data with the aim of helping students), a wellbeing bundle of resources, apps, etc., and more work on analytics, their use to help students and staff, and the ethical issues involved.

Naomi read ‘Do Online Mental Health Services Improve Help-Seeking for Young People? A Systematic Review’.

This 2014 article discusses young people using online services to look for help and information about mental health. The review investigates the effectiveness of online services but notes that much more research is needed in this area. The article moves between the ideas of help-seeking and self-help and discusses the benefits of both. It mentions that young people now feel they should problem-solve for themselves, so providing an online space where they can access useful information is a good way to support help-seeking.

The review mentions that ‘only 35% of young people experiencing mental health problems seek professional face to face help’. This statistic supports the case that online services are needed to provide help and assistance to those in need. It adds that young people have improved mental health literacy and are better at recognising that they, or someone they know, may need help. With face-to-face professional help becoming increasingly hard to access, more are turning to online information. It has to be said, however, that online help often has no follow-up: young people can be given information online with no way to continue receiving assistance.

One interesting part of the article discussed structured and unstructured online treatment programmes. Although effective at reducing depression and anxiety, structured programmes had poor uptake and high drop-out rates, with no way for help to be maintained. Unstructured programmes are more useful in the sense that users can follow links that appear useful and disregard information that seems irrelevant.

This article wasn’t student-focused and only covered data collected from younger people, but the ideas behind the review are pertinent in a higher education context.

Suggested reading

Jisc Horizon Report mental health and wellbeing section

Or investigate / try out one or more of the online services listed here (or any other – these just seem like helpful lists):

Or related articles

Near future teaching – notes from reading group

For our latest reading group, following Sian Bayne’s fascinating Near Future Teaching seminar for BILT, we wanted to look in more depth at the project materials and related reading.

Michael read ‘Using learning analytics to scale the provision of personalized feedback,’ a paper by Abelardo Pardo, Jelena Jovanovic, Shane Dawson, Dragan Gasevic and Negin Mirriahi. Responding to the need to provide individual feedback to large classes, this study presented and tested a novel system that uses learning analytics data generated by student activity within a learning management system to deliver what the authors called ‘personalized’ feedback. As designed, the system allowed instructors to create small, one- or two-sentence pieces of feedback for each activity within a course. Based on these, each week students would receive a set of ‘personalized’ feedback responding to their level of participation. The authors found an improvement in student satisfaction with the feedback received, but only a marginal improvement in performance compared to previous years.

There were limits to the methodology – the study only used at most three years of student data for comparison – and the authors’ definition of ‘personalized feedback’ seemed in practice to be little more than customized boilerplate, but the study nevertheless had a few interesting points. First, it was admirable in the way it sought to use learning analytics techniques to improve feedback in large courses. Second, the authors took the well-considered step of focusing the feedback not on the content of the course but on students’ study habits: the feedback might encourage students to make sure they did all the reading that week if they weren’t doing well, or encourage them to review the material again if they had already reviewed it all once. Third, the article offered an interesting recounting of the history of the concept of feedback as it moved from addressing only the gap between targets and actual performance to a more holistic and continuous relationship between mentor and student.

Suzi read Higher education, unbundling, and the end of the university as we know it by Tristan McCowan. This paper starts with a thorough guide to the language of unbundling and the kinds of things we talk about when we talk about unbundling, followed by an extensive discussion of what this means for higher education. My impression from the article was that “unbundling” may be slightly unhelpful terminology, partly because it covers a very wide range of things, and partly because – if the article is to be believed – it’s a fairly neutral term for activities which seem to include asset-stripping and declawing universities. As an exploration of the (possible) changing face of universities it’s well worth a read. You can decide for yourself whether students are better off buying an album than creating their own educational mixtape.

Roger read “Future practices”. For world 1, human-led and closed, I was concerned that a lot was only available to “higher paying students” and there was no mention at all of collaborative learning. For world 2, human-led and open, I liked the idea of the new field of “compassion analytics”, which would be good to explore further, along with lots of challenge-based learning and open content. World 3, tech-led and closed, was appealing in its emphasis on wellbeing in relation to technology and a move away from traditional assessment, with failure recognised more as an opportunity to learn, and reflection and the ability to analyse and synthesise prioritised. From world 4 I liked the emphasis on lifelong learning and individual flexibility for students, e.g. to choose their own blocks of learning.

Chrysanthi read Future Teaching trends: Science and Technology. The review analyzes 5 trends:

  • datafication – e.g. monitoring students’ attendance, location, engagement, real-time attention levels,
  • artificial intelligence – e.g. AI tutoring, giving feedback, summarizing discussions and scanning for misconceptions, identifying human emotions and generating its own responses rather than relying only on past experience and data,
  • neuroscience and cognitive enhancement – e.g. brain-computer interfaces, enhancement tools like tech that sends currents to the brain to help with reading and memory or drugs that improve creativity and motivation,
  • virtual and augmented realities – e.g. that help to acquire medical skills for high-risk scenarios without real risk, or explore life as someone else to develop empathy, and
  • new forms of value – enabling e.g. the recording and verification of all educational achievements and accumulation of credit over one’s lifetime, or the creation of direct contracts between student and academic.

I liked it because it gave both pros and cons in a concise way. It allows you to understand why these trends would be useful and could be adopted widely, at the same time as you are getting a glimpse of the dystopian learning environment they could create if used before ethical and other implications have been considered.

Suggested reading

Feedback, NSS & TEF – notes from reading group

Chrysanthi read “Thanks, but no-thanks for the feedback”. The paper examines how students’ implicit beliefs about the malleability of their intelligence and abilities influence how they respond to, integrate and deliberately act on the feedback they receive. It does so based on a set of questionnaires completed by 151 students (113 female and 38 male), mainly from the social sciences.

Mindset: There are two kinds of mindset regarding the malleability of one’s personal characteristics. People with a growth mindset believe their abilities can grow through learning and experience; people with a fixed mindset believe they have a fixed amount of intelligence which cannot be significantly developed. “If intelligence is perceived as unchangeable, the meaning of failure is transformed from an action (i failed) to an identity (i am a failure)” (p851).

Attitudes towards feedback: Several factors that influence whether a person accepts a piece of feedback – e.g. how reflective it is of their knowledge and whether it is positive or negative – were measured, as well as two outcome measures.

Defence mechanisms: Defence mechanisms are useful in situations we perceive as threatening, as they help us control our anxiety and protect ourselves. But if we are very defensive, we are less able to perceive the information we receive accurately, which can be counterproductive; e.g. a student may focus on who has done worse, to restore their self-esteem, rather than on who has done better, which could be a learning opportunity.

The results of the questionnaires measuring the above showed that more students had a fixed mindset (86) than a growth mindset (65), and that their mindset indeed affected how they responded to and acted on feedback.

  • Growth-mindset students are more likely to challenge themselves and see the feedback giver as someone who can push them out of their comfort zone in a good way that will help them learn. They are more motivated to change their behaviour in response to the feedback received, engage in developmental activities and use the defence mechanisms considered helpful.
  • Fixed-mindset students are also motivated to learn, but they are more likely to go about it in an unhelpful way. They make choices that protect their self-esteem rather than help them learn, they are not as good at using the helpful defence mechanisms, and they distort the facts of the feedback or think of an experience as all good or all bad. The authors seemed puzzled by the indication that fixed-mindset students are motivated to engage with the feedback, but do so by reshaping reality or dissociating themselves from the thoughts and feelings surrounding it.

Their recommendations?

  • Academics should be careful in how they deliver highly emotive feedback, even if they don’t have the time to make it good and individualised.
  • Lectures and seminars early in students’ studies, teaching them about feedback’s goals and related theory and practice, as well as action-orientated interventions (e.g. coaching), so they learn how to recognise any self-sabotaging behaviours and manage them intelligently.
  • Strategies to help students become more willing to experience – and stay with – the emotional experience of failure, e.g. enhancing the curriculum with opportunities for students to take risks, so they become comfortable with both “possibility” and “failure”.

I think trying to change students’ beliefs about the malleability of their intelligence would go a long way. If one believes their abilities are fixed and therefore if they don’t do well, they are a failure, a negative response to feedback is hardly surprising. That said, the responsibility of managing feedback should not fall entirely on the student; it still needs to be constructive, helpful and given in an appropriate manner.

Suzi read: An outsider’s view of subject level TEF, A beginner’s guide to the Teaching Excellence Framework, and Policy Watch: Subject TEF year 2, by the end of which she was not convinced anyone knows what the TEF is or how it will work.

Some useful quotes about TEF 1

Each institution is presented with six metrics, two in each of three categories: Teaching Quality, Learning Environment, and Student Outcomes and Learning Gain. For each of these measures, they are deemed to be performing well, or less well, against a benchmarked expectation for their student intake.

… and …

Right now, the metrics in TEF are in three categories. Student satisfaction looks at how positive students are with their course, as measured by teaching quality and assessment and feedback responses to the NSS. Continuation includes the proportion of students that continue their studies from year to year, as measured by data collected by the Higher Education Statistics Agency (HESA). And employment outcomes measures what students do (and then earn) after they graduate, as measured by responses to the Destination of Leavers from Higher Education survey – which will soon morph into Graduate Outcomes.

Points of interest re TEF 2

  • Teaching intensity (contact hours) won’t be in the next TEF
  • All subjects will be assessed (at all institutions), with results available in 2021
  • Insufficient data for a subject at an institution could lead to “no award” (so you won’t fail for being too small to measure)
  • Resources will be assessed
  • More focus on longitudinal educational outcomes, not (binary) employment on graduation
  • It takes into account the incoming qualifications of the students (so it does something like the “value add” thing that school rankings do) but some people have expressed concern that it will disincentivise admitting candidates from non-traditional backgrounds.
  • There will be a statutory review of the TEF during 2019 (reporting at the end of the year) which could change anything (including the gold / silver / bronze rankings)

Suzi also read Don’t students deserve a TEF of their own?, which talks about giving students a way in to play with the data so that, for example, if you’re more interested in graduate career destinations than in assessment and feedback you can pick on that basis (not on the aggregated data). It’s an interesting idea and may well happen, but as a prospective student I can’t say I understood myself – or the experience of being at university – well enough for that to be useful. There’s also a good response discussing the kinds of things you might find out too late about a university that would not be covered by statistics (the library is badly designed, lectures are at odd hours because rooms are at a premium, no real module choice).

Roger read “How to do well in the National Student Survey (NSS)”, an article from Wonkhe written in March 2018. The author, Adrian Burgess, Professor of Psychology at Aston University, offers some reflections based on an analysis of NSS results from 2007 to 2016.

Whilst many universities have placed great emphasis on improving assessment and feedback, this has “brought relatively modest rewards in terms of student satisfaction” and remains the area with the lowest satisfaction.

Burgess’ analysis found that the strongest predictors of overall satisfaction were “organisation and management” closely followed by “teaching quality”.

Amy read Feedback is a two-way street. So why does the NSS only look one way?, an article by Naomi Winstone and Edd Pitt. This piece highlighted that the NSS questions on feedback are framed as if feedback should be a passive experience – that students should be given their feedback. In 2017, the question was changed from “I have received detailed comments” to “I have received useful comments”. Both the old and new questions frame feedback as something that is received, a ‘transmission-focussed mindset’, whereas Winstone and Pitt argue that feedback should be a two-way relationship, with the student working with the feedback and their tutor to develop.
The authors do not believe that changing the NSS question will solve all of the problems with students’ perception of feedback (though it will definitely help!), but they do believe that by promoting feedback as something that individuals work with, take responsibility for, and seek out when they feel they need to develop in a certain area, the mindset will gradually change and feedback will become a more sustainable form of learning for students.

Suggested reading

From WonkHE

From the last time we did assessment & feedback, which was July 2017 (I’ve left in who read what then)