Miscellany – notes from the reading group

No theme this month – just free choice. Here’s what we read (full notes below):

Naomi read Stakeholders' perspectives on graphical tools for visualising student assessment and feedback data.

This paper from the University of Plymouth looks at the development and progression of learning analytics within Higher Education. Luciana Dalla Valle, Julian Stander, Karen Gresty, John Eales and Yinghui Wei all contributed. It covers how four graphical visualisation methods can be used by different stakeholders to interpret assessment and feedback data. The stakeholders comprise external examiners, learning developers, industrialists (employers), academics and students.

The paper discusses how there is often difficulty pulling information from assessments and feedback, as there can be a lot of data to cover. Having graphical visualisations means information can be shared and disseminated quickly, as there is one focal point to concentrate on. It’s mentioned that some can include ‘too much information that can be difficult for teachers to analyse when limited time is available.’ But it is also discussed how important it is to evaluate the visualisations from the point of view of the different stakeholders who may be using them.

The paper looks at how learning analytics can be seen as a way to optimise learning and allow stakeholders to fully understand and take on board the information they are provided with. For students it was seen as a way to get the most out of their learning whilst also flagging students facing difficulties. The paper also talks about how it brings many benefits to students who are described as the ‘overlooked middle’. Students are able to easily compare their assessments, attainment and feedback to see their progression. Students agreed that the visualisations could assist with study organisation and module choice, and it’s also suggested that taking these analytics into account can improve social and cultural skills. For external examiners, analytics was seen as a real step forward in their learning and development. For them it was a quick way to assimilate information and improve their ‘knowledge, skills and judgement in Higher Education Assessment’. Having to judge and compare academic standards over a diverse range of assessment types is difficult, and visual graphics bring a certain simplicity to this. For learning developers too, using these images and graphics is suggested to help in ‘disseminating good practice’.

The paper goes on to explain how it does improve each stakeholder’s evaluation of assessment. It goes into a lot of detail about the different visualisations suggested, commenting on the benefits and drawbacks of each, which is worth taking a more detailed look at. It should also be noted that the paper suggests there could be confidentiality or data protection issues involved with sharing or releasing data such as this, as in most cases such data is only seen at faculty or school level. Student demoralisation is also mentioned near the end of the paper as a contributing factor to why these graphics may not always work in the best ways. It finishes by suggesting that it would be interesting to study changes in students’ confidence and self-esteem due to assessment data sharing. It’s an interesting idea that needs to be carefully thought out and analysed to ensure it produces a positive and constructive result for all involved.

Suzanne read: Social media as a student response system: new evidence on learning impact

This paper begins with the assertion that social media is potentially a “powerful tool in higher education” due to its ubiquitous nature in today’s society. However, it also recognises that to date the role of social media in education has been a difficult one to pin down. There have been studies showing that it can both enhance learning and teaching and be a severe distraction for students in the classroom.

The study sets out to answer these two questions:

  • What encourages students to actively utilise social media in their learning process?
  • What pedagogical advantages are offered by social media in enhancing students’ learning experiences?

To look at these questions, the researchers used Twitter in a lecture-based setting with 150 accounting undergraduates at an Australian university. In the lectures, Twitter could be used in two ways: as a ‘backchannel’ during the lecture, and as a quiz tool. As a quiz tool, the students used a specific hashtag to Tweet their answers to questions posed by the lecturer at regular intervals during the session, related to the content that had just been covered. These lectures were also recorded, and a proportion of the students only watched the recorded lecture as they were unable to attend in person. Twitter was used for two main reasons. First, the researchers assumed that many students would already be familiar and comfortable with Twitter. Secondly, using Twitter wouldn’t require any additional tools, such as clickers, or software (assuming that students already had it on their devices).

Relatively early on, several drawbacks to using Twitter were noted. There was an immediate (and perhaps not surprising?) tension between the students’ and lecturers’ public and private personas on Twitter. Some students weren’t comfortable Tweeting from their own personal accounts, and the researchers actually recommended that lecturers make new accounts to keep their ‘teaching’ life separate from their private lives. There was also a concern about the unpredictability of tapping into students’ social media, in that the lecturer had no control over what the students wrote, in such a public setting. It also turned out (again, perhaps unsurprisingly?) that not all students liked or used Twitter, and some were quite against it. Finally, it was noted that once students were on Twitter, it was extremely easy for them to get distracted.

In short, the main findings were that the students on the whole liked and used Twitter for the quiz breaks during the lecture. Students self-reported being more focused, and that the quiz breaks made the lecture more active and helped with their learning as they could check their understanding as they went. This was true for students who actively used Twitter in the lecture, those who didn’t use Twitter but were still in the lecture in person, and those who watched the online recording only. During the study, very few students used Twitter as a backchannel tool, instead preferring to ask questions by raising a hand, or in breaks or after the lecture.

Overall, I feel that this supports the idea that active learning in lectures is enhanced when students are able to interact with the material presented and with the lecturer. Breaking up content and allowing students to check their understanding is a well-known and pedagogically sound approach. However, this study doesn’t really demonstrate any benefit of using Twitter, or social media, specifically. The fact that students saw the same benefit regardless of whether they used Twitter to participate, or were just watching the recording (where they paused the recording to answer the questions themselves before continuing to the answers), seems to back this up. In fact, in not using Twitter in any kind of ‘social’ way, and in trying to hive off a private space for lecturers and students to interact in such a public environment, the study seems to miss the point of social media altogether. For me, the initial research questions therefore remain unanswered!

Suzi read Getting things done in large organisations

I ended up with a lot to say about this so I’ve put it in a separate blog post: What can an ed techie learn from the US civil service? Key points for me were:

  • “Influence without authority as a job description”
  • Having more of a personal agenda, and preparing for what I would say if I got 15 minutes with the VC.
  • Various pieces of good advice for working effectively with other people.

Chrysanthi read Gamification in e-mental health: Development of a digital intervention addressing severe mental illness and metabolic syndrome (2017). This paper talks about the design of a gamified mobile app that aims to help people with severe chronic mental illness in combination with metabolic syndrome. While the target group is quite niche, I love the fact that gamification is used in a context that considers the complexity of the wellbeing domain and the interaction between mental and physical wellbeing. The resulting application, MetaMood, is essentially the digital version of an existing 8-week long paper-based program with the addition of game elements. The gamification aims to increase participation, motivation and engagement with the intervention. It is designed to be used as part of a blended care approach, combined with face-to-face consultations. The game elements include a storyline, a helpful character, achievements, coins and a chat room, for the social element. Gamification techniques (tutorial, quest, action) were mapped to traditional techniques (lesson, task, question) to create the app.

The specific needs of the target group required the contributions of an interdisciplinary team, as well as relevant game features; eg the chat room includes not only a profanity filter, but also an automatic intervention when keywords like ‘suicide’ are used (informing the player of the various resources available to help in these cases). Scenarios, situations and names were evaluated for their potential to trigger patients, and changes were made accordingly; eg the religious-sounding name of a village was changed, as it could have triggered delusions.
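Purely as an illustration (the paper describes the behaviour, not an implementation), a keyword-triggered safeguard like the one in the chat room might be sketched as below. The word lists, message text and function name are all invented for the example:

```python
# Illustrative sketch of a chat-room safeguard of the kind described for
# MetaMood: profanity is masked, and crisis keywords trigger an automatic
# message pointing the user at support resources.
# All word lists, messages and names here are invented for the example.

PROFANITY = {"damn"}                        # placeholder word list
CRISIS_KEYWORDS = {"suicide", "self-harm"}  # placeholder keyword list

HELP_MESSAGE = ("It sounds like you may be going through a difficult time. "
                "Support resources are available - please consider reaching out.")

def moderate_message(text: str) -> dict:
    """Return the cleaned message plus any automatic intervention."""
    tokens = text.split()
    normalised = {t.strip(".,!?").lower() for t in tokens}
    # Mask profane words while leaving the rest of the message intact.
    cleaned = " ".join("*" * len(t) if t.strip(".,!?").lower() in PROFANITY
                       else t for t in tokens)
    # A crisis keyword triggers the supportive message alongside the post.
    intervention = HELP_MESSAGE if normalised & CRISIS_KEYWORDS else None
    return {"message": cleaned, "intervention": intervention}
```

A real system would of course need clinically reviewed word lists and escalation paths; the point is only how the two described features (filtering and automatic intervention) can coexist in one moderation step.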

The four clinicians who reviewed the app said it could proceed to clinical trial with no requirement for further revision. Most would recommend it to at least some of their clients. The content was viewed by most as acceptable and well targeted, and the app as interesting, fun and easy to use. I wish there had been results of the clinical trial, but it looks like this is the next step.

Roger read “Analytics for learning design: A layered framework and tools”, an article from the British Journal of Educational Technology.

This paper explores the role analytics can play in supporting learning design. The authors propose a framework called the “Analytics layers for learning design (AL4LD)”, which has three layers: learner, design and community analytics.

Types of learner metrics include engagement, progression and student satisfaction while experiencing a learning design. Examples of data sources are VLEs or other digital learning environments, student information systems, sensor-based information collected from physical spaces, and “Institutional student information and evaluation (assessment and satisfaction) systems”. The article doesn’t go into detail about the latter, for example to explore and address the generic nature of many evaluations, eg the NSS, which is unlikely to provide meaningful data about the impact of specific learning designs.

Design metrics capture design decisions prior to the implementation of the design. Examples of data include learning outcomes, activities and tools used to support these. The article emphasises that “Data collection in this layer is greatly simplified when the design tools are software systems”. I would go further and suggest that it is pretty much impossible to collect this data without such a system, not least as it requires practitioners to be explicit about these decisions, which otherwise often remain hidden.

Community metrics are around “patterns of design activity within a community of teachers and related stakeholders”, which could be within or across institutions. Examples of data include types of learning design tools used and popular designs in certain contexts. These may be shared in virtual or physical spaces to raise awareness and encourage reflection.

The layers inter-connect, eg learning analytics could contribute to community analytics by providing evidence for the effectiveness of a design. The article goes on to describe four examples. I was particularly interested in the third one, which describes the “experimental Educational Design Studio” at the University of Technology Sydney. It is a physical space where teachers can go to explore and make designs, ie it also addresses the community analytics layer in a shared physical space.

This was an interesting read, but in general I think the main challenge is the collection of data in the design and community aspects. For example, Diana Laurillard has been working on systems to do this for many years, eg the Learning Design Support Environment and the Pedagogical Patterns Collector, but there seems to have been little traction.

Amy read: Addressing cheating in e-assessment using student authentication and authorship checking systems: teachers’ perspectives. Student authentication and authorship systems are becoming increasingly widely used in universities across the world, with many believing that cheating is on the rise across a range of assessments. This paper looks at two universities (University A in Turkey and University B in Bulgaria) which have implemented the TeSLA system (an Adaptive Trust-based eAssessment System for Learning). The paper doesn’t review the effectiveness of the TeSLA system, but rather the views of the teachers on whether the system will affect the amount of cheating taking place.

The research’s main aim is to explore the basic rationale for the use of student authentication and authorship systems, and within that, to look at four specific issues:

  1. How concerned are teachers about the issue of cheating and plagiarism in their courses?
  2. What cheating and plagiarism have teachers observed?
  3. If eAssessment were introduced in their courses, what impact do the teachers think it might have on cheating and plagiarism?
  4. How do teachers view the possible use of student authentication and authorship checking systems, and how well would such systems fit with their present and potential future assessment practices?

Data was collected across three different teaching environments: face-to-face teaching, distance learning and blended learning. Data was collected via questionnaires and interviews with staff and students.

The findings, for the most part, were not hugely surprising: the main type of cheating that took place at both universities was plagiarism, followed by ghost-writing (or the use of ‘essay mills’). These were the most common methods of cheating in both exam rooms and online. The reasons staff believed students cheated and the reasons students themselves gave also varied widely. Both teachers and students believed that:

  • Students wanted to get higher grades
  • The internet encourages cheating and plagiarism, and makes it easy to do so
  • There would not be any serious consequences if cheating and plagiarism was discovered

However, teachers also believed that students were lazy and wanted to take the easy way out, whereas students blamed pressure from their parents and the fact that they had jobs as well as studying.

Overall, staff were concerned about cheating, and believed it was a widespread and serious problem. The most common and widespread problems were plagiarism and ghost-writing, followed by copying and communicating with others during assessments. When asked about ways of preventing cheating and plagiarism, teachers were most likely to recommend changes to educational approaches, followed by assessment design, technology and sanctions. Teachers across the different teaching environments (face-to-face, blended and distance learning) were all concerned about the increase in cheating that may take place with online/eAssessments. This was especially the case for staff who taught on distance learning courses, where students currently take an exam under strict conditions. Finally, all staff believed that the use of student authentication and authorship tools enabled greater flexibility in access for those who found it difficult to travel, as well as in forms of assessment. However, staff believed that cheating could still take place regardless of these systems, but that technology could be used in conjunction with other tools and methods to reduce cheating in online assessments.

What can an ed techie learn from the US civil service?

I recently read Getting things done in large organisations by Thomas Kalil (profession: “expert”, according to Google). Kalil worked for the Clinton and Obama administrations on science and technology policy. This is his attempt to share what worked for him. I was interested because, 12 – nearly 13 – years in at Bristol, I’m still learning how to get things done. From what I understand of their structure and rate of change, the civil service and universities are at least in some ways similar.

The paper is aimed at “policy entrepreneurs”: people who generate or spot new ideas, then evaluate and (if appropriate) help make them happen. I grew up in the 80s and the word entrepreneur brings to mind Rodney from Only Fools and Horses … I can’t imagine wanting to apply the term to myself. But the principle certainly applies within my role, and indeed many professional roles.

Kalil starts by giving a bit of a career history, which is probably only relevant if you would like to become a policy advisor. This takes up the first six pages. The remaining ten pages are pretty solidly filled with good advice. Here are some of the things most directly applicable to my role in digital education…

“Influence without authority as a job description” – this resonates, working in an organisation that is still often operating on goodwill and people’s desire to cooperate

Thought experiment: What if you had 15 minutes with the president (in my case the VC), and if he liked your idea he would be willing to call anyone to make it happen. Kalil developed this as a way of making people think seriously about what they would change, and who would be in a position to do it. Follow up questions:

  • Would the people called be willing & able to do it?
  • Is there anything we could change (for them or about the proposal) to make them more willing or more able?
  • What existing forums / mechanisms are there that could carry this forward? (This also relates to the paper from a previous reading group on evidence and the question: would the initiative continue if we walked away?)

There’s something empowering about having an answer to this prepared. I don’t, but I will.

How does your remit fit into the bigger picture? Related to the thought experiment above is the importance of keeping aware of – and actively looking for – areas where digital education can further the broader aims of the university.

Partnership working (working collaboratively) works best when you have good relationships. Both sides need to:

  • Understand each other’s priorities
  • Trust each other to follow-through
  • Feel able to disagree and raise concerns

Relationships need to be a two-way street, not just one side dictating. It’s also important to understand the internal politics and personalities you are working with. Clearly the bulk of this is good advice for all relationships, professional and personal.

Have an agenda. In my experience teams do tend to do this but, for personal job satisfaction at the very least, having a personal agenda makes some sense. Kalil has some good questions to ask on this (go read them) but key for me is: why do I believe this is the right thing to do and that it will work? Also, recognise that you won’t know the answers without listening to (and asking interesting questions of) other people.

Make it easy for other people to help you. The example Kalil gives is: if you want someone to send an email, write it for them. Closely related to this is his advice for making follow-up more likely to happen: ask people if they think they can complete their assignment; document and follow up commitments; have deadlines, even if they’re artificial; if someone isn’t following up, try to find out why.

Understand what tools are available to you. What are the things you / your team / the organisation can do to effect change?

Be open to ideas from a range of sources. Engage with people from outside of your own contexts. Adapt and imitate what works elsewhere.

Plan for a change in administration (surprisingly applicable in universities, this one).

And some don’ts:

  • Don’t try to do too many things at once
  • Don’t act on the urgent and forget the important
  • Don’t spend too much time on reports
  • Don’t let things drag on indefinitely
  • Don’t surprise people, they don’t like it

Nothing earth-shattering perhaps but good solid advice, much of which it’s worth being reminded of. I’d recommend it.

Wellbeing – notes from the reading group

Roger read: Curriculum design for wellbeing.

This is part of an online professional development course for academics, produced by a project run by a number of Australian universities and co-ordinated by the University of Melbourne. It aims to build the capacity of university educators to design curriculum and create teaching and learning environments that enhance student mental wellbeing. There are five modules: on student wellbeing, curriculum design, teaching strategies, difficult conversations and your wellbeing.

I focussed on module 2, which is on curriculum design. It starts by stressing the importance of students, through the curriculum, experiencing autonomous motivation, a sense of belonging, positive relationships, and feelings of autonomy and competence (M-BRAC). All of these are aspects of good practice in curriculum design.

It goes on to consider how elements of curriculum design support student mental wellbeing, covering alignment, organisation and sequencing of content, engaging learning activities and a focus on assessment for learning.

For example, aligning ILOs with assessment and learning activities helps student autonomy, as students understand how what they are doing contributes to their goals, and they develop their knowledge and skills, including self-regulation, to achieve the ILOs. Assessment for learning plays a key role here. Clear organisation and sequencing of content contribute to effective learning. Both alignment and structure help to build students’ sense of competence. Engaging and meaningful learning activities can increase student motivation and encourage peer interaction, which can contribute to building relationships and a sense of belonging.

It suggests that when reviewing the curriculum one should ask:

  • How will the curriculum be experienced by my (diverse) students eg international students, mature students, “first in family” students?
  • Will the curriculum foster or thwart experiences of M-BRAC?

The module then has a number of FAQs you are asked to consider, with suggested answers. These were really useful as they tease out some of the complexities, for example “Is setting group assignments in the first year a good way of helping students develop positive peer relationships, and feel a sense of connectedness or belonging?” Here they recognise that if group work is not well designed, or if students are not supported to develop group work skills, it can have a negative impact.

The module ends with a set of case studies illustrating how curricula have been re-designed to better support M-BRAC.

Amy read: Approach your laptop mindfully to avoid digital overload.

This was a short article that recognises the ever-increasing belief that we are constantly bombarded with masses of new information, which, in turn, means that many are suffering from stress-related diseases, anxiety and depression. Our reliance on digital devices to provide constant streams of information, in the form of news articles, social media feeds and messages, means that we feel lost without them. A full digital detox is suggested at the beginning of the article, though this may be a short-term solution and often an impractical one.

The authors of this article suggest introducing the practice of mindfulness into our lives to combat this. They describe mindfulness as ‘a moment-to-moment attention to present experience with a stance of open curiosity’. Mindfulness has been studied extensively by the medical community and has been shown to help with stress, anxiety and depression. One can ‘reprogramme’ one’s mind to deal with stresses more easily by training it to be more present. The authors suggest two key ways to introduce mindfulness into our use of digital devices to reduce the pressures they can put on us.

One of the methods they suggest is ‘mindful emailing’, which includes practices such as taking three breaths before responding to a stressful email, and considering the psychological effect that the email will have on the recipients.

The second method they suggest is the mindful use of social media, citing ‘checking our intentions before uploading a feed (post?), being authentic in our communications and choosing the time we spend on social media, rather than falling into it.’

If you haven’t tried mindfulness before take a look at these tips and short audio meditations.

Chrysanthi read: Designing a product with mental health issues in mind.

This article – true to its title – talks about including technological features that aim to help vulnerable users. While the examples given are taken from a banking application context, the suggestions can be applied to other contexts. More specifically, the article mentions positive friction and pre-emptive action. Positive friction goes against developers’ usual desire to make everything easier and faster, and instead aims to put some necessary obstacles in the way for users who need them. The example used is allowing certain users with somewhat “impulsive” behaviour to check their recent purchases and confirm that they indeed want them. This would help eg bipolar disorder sufferers, who overspend in their manic phase, often at night, and slip into depression in the morning because of their irreversible mistake. In the specific app, this is still a speculative feature. Pre-emptive action aims to prevent trouble by anticipating certain events and acting on them, eg perceiving a halt in income and sending a well-timed notification to start a conversation so the person doesn’t end up in debt (and therefore create more stress for themselves). The article also suggests allowing vulnerable customers to choose their preferred time and form of communication (eg phone might be anxiety-inducing, or email might seem complex).
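As a purely hypothetical sketch (the article describes the idea, not an implementation), positive friction for opted-in users might amount to something as simple as holding back purchases made in a risky time window for later confirmation. The time window, function name and opt-in flag are all assumptions made for the example:

```python
# Hypothetical sketch of "positive friction": purchases made by opted-in
# users during a risky window (here, late at night) are flagged to be held
# for explicit confirmation rather than finalised immediately.
from datetime import datetime

def needs_confirmation(purchase_time: datetime, opted_in: bool) -> bool:
    """Flag a purchase for later review instead of processing it at once."""
    if not opted_in:  # friction is only added for users who have chosen it
        return False
    # Assumed risk window: 23:00 to 06:00, when impulsive overspending is
    # described as most likely.
    return purchase_time.hour >= 23 or purchase_time.hour < 6
```

The key design point is the opt-in: friction is deliberately added only for users who want the obstacle, rather than slowing everyone down.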

In an education context, positive friction could be relevant in cases where students are repeatedly doing things they no longer need to do. This would help when – under the illusion they are still learning – students are focusing on redoing exercises they already know how to do – which might help them feel accomplished but doesn’t add value from some point on – or on consuming more and more content, even when they haven’t actually digested what they have learned so far. It isn’t very clear how this could be applied during exam period, though. Pre-emptive action is perhaps easier to translate in an educational context. Any action (or inaction) that is either outside the student’s usual pattern or outside a successful pattern, might be a conversation starting point, or a trigger for suggestions for alternative ways to handle their studies. Also, allowing them different options to learn and communicate with their professors and peers.

Chrysanthi also read: E-mail at work: A cause for concern? The implications of the new communication technologies for health, wellbeing and productivity at work.

This paper explores the potential negative implications of using email at work. The email features they consider potentially problematic are speed, multiple addressability, recordability, processing and routing. Essentially, email is a message that can be instantly transmitted to several people at once, automatically stored, and easily altered, with different versions sent to various people, not all of whom are necessarily visible on the recipient list. According to the authors, emails may increase stress by increasing workload and interruptions, adding difficulty to the interpretation of message and tone, increasing misunderstandings and groupthink, reducing helpful argumentation and social support, increasing isolation and surveillance – which increases discontent – offering a new ground for bullying and harassment, and hindering the processing of negative feelings. Having established these potential negative implications, the authors point out that more research is needed to understand the optimal ways to use email at work, for effective communication and humane workplaces.

Naomi read: Did a five-day camp without digital devices really boost children’s interpersonal skills?

This article was about a brief study led by Yalda Uhls in Southern California. It studied two groups of pupils who on average spent 4.5 hours a day ‘texting, watching TV, and video gaming’. Half of the children were sent on a five-day educational camp in the countryside where all digital devices were banned. The other half stayed at school as usual. Emotional and psychological tests were carried out on the students before and after the five days were completed.

There was a small amount of evidence to suggest the children who had spent time away from devices improved psychologically over the five days. However, because there were several small problems with the study, no firm answers can be taken from it. It’s suggested the children who went away for the five days only looked like they improved on the tests because they started at a lower level than the children who stayed at school. It was also suggested that the test scores of the children who stayed at school deteriorated because they were tired from doing a week’s work.

As it suggests in the article, the results of this study weren’t particularly hard-hitting, but it does raise the question of how much the younger generations are using their devices throughout the day. Uhls does admit in the article that there were shortcomings to the study, but suggests that these findings relate to the ‘wider context of technology fears’ and hopes the paper will be ‘a call to action for research that thoroughly and systematically examines the effects of digital media on children’s social development.’ Although the study needed a more comprehensive approach, the ideas behind it are interesting and relate to several issues that we see in everyday life – is it good for us to spend so much time on our devices, or is it integral to how we live now?

Suzi read: Learning in the age of digital distraction

This interview with Adam Gazzaley, a neurologist, is a plug for a book called The Distracted Mind, in which he and research psychologist Larry D Rosen talk about how the way our brains are wired influences how we use technology.

They suggest that information-foraging is a development of our evolved food-foraging mechanisms, and so is to some extent driven by our very basic drive to survive. Because of this, it is hard to stop it distracting us from setting and pursuing higher-level goals.

Information-foraging doesn’t just affect people’s ability to focus: it can cause anxiety and stress, and affect mood.

Suggestions for possible ways to combat this include:

  • accepting that we need to (re-)learn to focus (they are also developing brain-training video games)
  • using play in education (though this was only mentioned very briefly and I wasn’t clear if it meant playfulness or gamification)
  • physical exercise
  • mindfulness
  • sitting down to dinner as a family, or otherwise building in device-free interaction time


UCISA – Beyond Lecture Capture event

On 14th June, Neil Davey (Teaching, Learning and Collaboration Spaces Team Manager) and I attended the UCISA Beyond Lecture Capture event. This event focused on how lecture recording has impacted student learning and enhanced the student experience.

Session topics included:

  • Research on the student learning experience with lecture capture
  • Student feedback panel session
  • Analysis of lecture recording usage compared to grades
  • Moving from the traditional lecture to the flipped classroom

Many of the talks expanded on what we have seen at Bristol and the supporting research –

  • Students love lecture capture
  • They use it primarily for revision and enhancing their notes
  • Audio quality is key
  • Good data is paramount – students do not like recordings with no point of reference in the title
  • Incomplete coverage of rooms is frustrating for them
  • Impact on attendance is a concern of academics
  • Induction for students is needed at a point they are most receptive – ideally contextualised by academics rather than delivered in the abstract

I did hear a couple of things that surprised me. For example, both the University of Sheffield and the University of York reported high percentages of students – circa 40% – watching recordings all the way through. How do we test what we think we know, and what questions should we be asking of the data (both quantitative and qualitative) we have already gathered, to see whether our assumptions are correct?

While not an exhaustive list –

How many of our students watch the whole recording?

How are closed caption units used – does this differ from other recordings?

Is there a positive impact on student well being e.g. reduced anxiety when lectures are recorded?

How do we quantify any effect on attainment?


Playful learning – notes from the reading group

Suzi read Playful learning: tools, techniques, and tactics by Nicola Whitton

This is a useful scene-setting article, suggesting ways of framing discussions on playful learning and pointing the way to unexplored territory suitable for future research.

There are three ideas about how to talk about playful learning:

  • The magic circle – a socially constructed space in which play can happen
  • A mapping of aspects of games onto playful learning: surface structures of playful learning <> mechanics of games, deep structures <> activities of play, implicit structure <> philosophy of playfulness
  • Tools / techniques / tactics of playful learning: objects artefacts & technologies / approaches / mechanics and attributes – these could serve as prompts for getting playfulness into teaching

Whitton suggests three characteristics of the magic circle which make it pedagogically useful: “the positive construction of failure; support for learners to immerse themselves in the spirit of play; and the development of intrinsic motivation to engage with learning activities.” In playful activities, failure will be framed positively, participants suspend disbelief which can encourage creativity, and participation is voluntary so there is intrinsic motivation (the difficulty of this last in particular in a formal education setting is acknowledged).

There’s a lot of acknowledgement that playfulness may not be an easy fit in higher education. Obstacles include: the inescapability of real-world power relationships, confusing gamification with true playfulness, the need for things to be mandatory and assessed, existing attitudes to failure, prejudice about play being for children, and a lack of time, confidence and social capital.

I wasn’t certain about the point about play being a privilege. While certain types of play might attract a relatively slender demographic (escape rooms, real world games) and it’s important not to assume that everyone would want to engage in these, adults seem to play to learn in a range of contexts. I thought about the kinds of spaces where you would see playful learning: cooking, karaoke, parkour, getting dressed up, new social media platforms (when FB started, everyone was poking each other and biting each other and throwing bananas; hashtags came from playing with the way Twitter worked), and of other adult pursuits. There is playfulness in higher education too, although it’s often not explicitly described as such. Maybe there is a danger of rarefying play, almost making it by definition a domain for geeks alone, and not recognising play that has not been made explicit.

This got me thinking about why we play, and why we might want to play in HE, and about one of my favourite quotes:

“The things we want are transformative, and we don’t know or only think we know what is on the other side of that transformation. Love, wisdom, grace, inspiration – how do you go about finding these things that are in some ways about extending the boundaries of the self into unknown territory, about becoming someone else?”

— Rebecca Solnit, A field guide to getting lost

This sums up a lot of what university could and should be. Playfulness has to have a key role in that: a place to play with possible selves, both academic and social.

Chrysanthi read Gamifying education: what is known, what is believed and what remains uncertain: a critical review by Christo Dichev and Darina Dicheva.

This is a review aiming to establish what is known about gamification in educational contexts based on empirical evidence, rather than beliefs. The authors find that much more is believed or uncertain than known. Their main findings are that a. gamification has been adopted at a pace much faster than researchers’ pace in understanding how it works, b. there is very little knowledge about how to apply it effectively in specific contexts, and c. there is not enough evidence about its long-term benefit.

While the understanding of how to engage, motivate and aid learning through gamification is inadequate, researchers are still praising the practice, thus inflating expectations about its effectiveness. The frequent use of performance-centric game elements like points, levels, badges and leaderboards is noteworthy; in the absence of justification from the researchers implementing them, the authors hypothesize that this happens because they are similar to traditional classroom practices and easy to implement. But this leaves other major game elements out; the authors note – among others – role play, narrative, choice, and low-risk failure. These few elements are then expected to affect broad concepts like motivation, with researchers often concluding that that is the case, without enough evidence to claim it is so.

This implies a somewhat blind application of the easiest-to-implement elements of gamification, with the belief that it will be enough to motivate students to perform better. But how are points different to marks and levels different to grades and chapters?

Perhaps gamification can’t be a canned, one-size-fits-all-learning-contexts solution. Perhaps researchers and practitioners need to put in time and at least a bare minimum of imagination to create something that will be engaging enough for students, for the evidence supporting it to not be stamped “inconclusive” when under scrutiny.

David read Playful learning in Higher Education: developing a signature pedagogy by Rikke Toft Nørgård, Claus Toft-Nielsen & Nicola Whitton (2017)

This paper starts off with a bit of a rant about the commercialisation of higher education and the focus on metrics to measure performance, and how this creates an assessment-driven environment focused on goal-oriented behaviours characterised by avoidance of risk and fear of failure. The authors see recent gameful approaches in higher education as a response to this, but warn that while gamification may increase motivation, games often focus on extrinsic motivational drivers and the results may be short-lived. They also cite research which points to issues around perceived appropriateness and students manipulating points-based incentive systems (and my colleagues and I have encountered examples of this in our teaching).

In contrast to gamification, they regard playful learning as something which encourages intrinsic and longer-term motivation by offering the chance to explore and experiment without fear of being judged for failure, and therefore being able to learn from it. They use the ideas of the ‘magic circle’ and ‘lusory attitude’ to describe the environment in which this can occur. The concept of the magic circle is used a lot in gaming and is a metaphor for the ‘space’ we enter into when we fully engage with a game, accepting the different norms and codes of practice (or actively constructing them with other ‘players’). This space can be physical (e.g. sports), virtual/imaginary (e.g. computer games), or a combination of both (e.g. a child’s tea party). For this to work, we need to assume a ‘lusory attitude’. This gives participants a shared mindset in which they are free to play, experiment and fail in a safe place.

The Magic Circle – How Games Transport Us to New Worlds

The authors then turn to the question of how to implement such an approach. Using the results of two studies about what students report (a) makes their learning enjoyable and (b) disengages them, they develop a ‘signature pedagogy’ for playful learning in higher education. The notion of ‘signature pedagogy’ they adopt is split into three levels:

  • The foundation is formed by Implicit (playful) structures, which are the necessary assumptions and attitudes (values, habits, ethics)
    • Lusory attitude
    • Democratic values and openness
    • Acceptance of risk-taking and failure
    • Intrinsic motivation
  • Deep (play) structures represent the nature of the activities which the implicit structures facilitate
    • Active and physical engagement
    • Collaboration with diverse others
    • Imagining possibilities
    • Novelty and surprise
  • Surface (game) structures are the ‘mechanics’ of an activity, including the materials, tools and actions involved
    • Ease of entry and explicit progression
    • Appropriate and flexible levels of challenge
    • Engaging game mechanics
    • Physical or digital artefacts

The authors see the implicit (playful) structures as the necessary starting point for their ‘signature pedagogy’ but do not say how students get to this point. Indeed they acknowledge the inherent paradox in their model:

“…for many students to view learning as valuable then it must be valued by the system (assessed), yet it is simultaneously this assessment that makes learning stressful and undermines the creation of a safe and comfortable environment.”

For me, then, this article leaves three interrelated questions to be discussed:

  1. For playful learning to be successful, do students need to have the implicit structures already in place or can students acquire these through the playful activity itself?
  2. If these implicit structures are prerequisite, how do we get students to acquire them?
  3. As this involves a change in students’ attitudes which the authors argue are reinforced by the current assessment-driven environment, does this pedagogical approach have any chance of success without change at the programme or institutional level?

Suzanne read Unhappy families: using tabletop games as a technology to understand play in education by John Lean, Sam Illingworth, Paul Wake, published in the ALT Journal special issue

In this article, the authors decide to take a step back when considering the ‘future’ of digital technologies in relation to playful learning by considering traditional tabletop games as a form of technology. They aimed to better understand the affordances of digital game tools by looking at tabletop games as an analogue, in order to reflect critically on the pedagogical uses of games and playful learning. Their hypothesis was that tabletop games (see the article for a full definition of how they classify a game as ‘tabletop’) are successful because: 1) they provide an immediate and accessible shared space, which is also social; 2) this space and the game are both easily modified by players and educators; and 3) they provide a tactile, sensory experience. So, in essence, that they are social, modifiable and tactile, which are all things that could be transferred into digital games in education, but which are often overlooked.

To explore this hypothesis, they used a specific game, ‘Gloom‘, which was played by participants at the 2017 ALT Playful Learning Conference. In relation to the first hypothesis, they found that the game encouraged a lot of social interaction. Firstly, the game encourages players to talk about their recent lived experiences as a means of deciding who goes first (ie, who has had the worst day thus far). Additionally, there is a storytelling element of the game, which also encourages an empathetic interaction between the game and the players, as well as between players.

Regarding how modifiable the game is, the players found it was easy to change and adapt the game, even during play. They also explored the ways of playing around with and stretching the rules, to create different rules or games within the game play. The authors note that this is often not as easily achieved in digital games, where rules can be more fixed and more difficult to circumvent. Thirdly, the players did undoubtedly find the game tactile, as the cards provide a physical element, further enhanced by the way the cards are played. The cards themselves have transparent elements, so as you stack cards you create different versions of them, allowing for the storytelling element.

In conclusion, the authors used this game play experience to revisit some preconceptions about what ‘play’ or ‘playfulness’ is in a game context. They felt that the ‘true’ play seemed to happen when the players had modified the rules to the point where the game itself was almost no longer required. The players were exploring and testing their new game playfully, in the way that they were interacting with each other and the environment. In terms of education, they felt that this playfulness had great potential for learning. The process of negotiating the play, and working out how to play with others who might have different ideas to you (for example, either wanting to stick to the rules or wanting to break them) is potentially a powerful social learning opportunity.

However, they also noted that this very character of playful learning – that it is negotiated and created by the context and participants – makes it extremely difficult to categorise or understand pedagogically. If we need to allow for such variety of outcomes in playful learning, it can be difficult to work out how we can situate it within other educational structures, like lesson plans or learning objectives.

Suggested reading

OER18 – some of my favourite ideas from the conference

A couple of weeks ago I went along to OER18. There was a lot to like about the event and so much I’d have loved to hear more about. Here are some of my favourite ideas from the talks I attended…

Helping staff understand copyright for reuse

Glasgow Caledonian found that understanding copyright was a barrier to their staff reusing content so made a quick, self-service copyright advisor. It’s very easy to use and has a traffic light system to indicate whether you can go ahead, need to investigate further, or can’t use the resource. The advice is cc-by licensed so could easily be repurposed, and they are currently developing an HTML5 version.
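As an illustration only, the traffic-light idea could be sketched as a simple lookup. To be clear, the licence names and the green/amber/red groupings below are my own assumptions for the sketch, not the actual rules in Glasgow Caledonian's advisor:

```python
# Hypothetical sketch of a traffic-light copyright check, loosely modelled on
# the kind of self-service advisor described above. The licence groupings are
# illustrative assumptions, not the advisor's real decision logic.
GREEN = {"cc0", "cc-by", "cc-by-sa"}             # go ahead (with attribution)
AMBER = {"cc-by-nc", "cc-by-nd", "cc-by-nc-sa"}  # investigate the restrictions

def traffic_light(licence: str) -> str:
    """Map a licence name to a green/amber/red recommendation."""
    licence = licence.strip().lower()
    if licence in GREEN:
        return "green"
    if licence in AMBER:
        return "amber"
    # All-rights-reserved or anything unrecognised: don't reuse without permission
    return "red"

print(traffic_light("CC-BY"))               # clearly reusable
print(traffic_light("cc-by-nc"))            # needs further investigation
print(traffic_light("all rights reserved")) # can't use the resource
```

The appeal of the real tool is exactly this simplicity: staff get a fast, unambiguous answer without needing to read licence legalese first.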

Approaches to institutional repositories

Southampton have developed EdShare for managing and hosting open content, with EdShare Hub now being developed to bring together content from the institutions using EdShare. It has been integrated into their systems and processes, with their comms and marketing team using EdShare behind their iTunesU presence and their medical school having MedShare. For further information see this presentation on EdShare from the ALT 2017 Winter Conference.

Edinburgh have an OER policy but they don’t have an institutional repository. Resources are shared on whichever online platform is most appropriate. They have accounts on Vimeo, Flickr, and similar services and through this approach hope to encourage true openness and adaptability. They also have a media asset management platform called Media Hopper.

Teaching APIs through Google Sheets

Martin Hawksey ran a good session, introducing the basics of APIs using a practical Google Sheets / Flickr exercise. Martin’s slides and the associated worksheet are available for reuse (cc-by).
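As a rough illustration of the API basics the session covered (the session itself used Google Sheets, not Python), here is a minimal sketch of what a Flickr REST request looks like. It only builds the request URL and makes no network call; `YOUR_KEY` is a placeholder for a real Flickr API key:

```python
from urllib.parse import urlencode

FLICKR_REST = "https://api.flickr.com/services/rest/"

def flickr_search_url(api_key: str, text: str, per_page: int = 10) -> str:
    """Build a Flickr photo-search request URL (no network call is made)."""
    params = {
        "method": "flickr.photos.search",  # which API method to call
        "api_key": api_key,
        "text": text,                      # free-text search term
        "per_page": per_page,
        "format": "json",                  # ask for a JSON response
        "nojsoncallback": 1,               # plain JSON, not a JSONP wrapper
    }
    return FLICKR_REST + "?" + urlencode(params)

url = flickr_search_url("YOUR_KEY", "open education")
print(url)
```

Seeing the request as nothing more than a URL with named parameters is the same demystifying step the Google Sheets exercise makes: paste the URL into a cell or a browser and the API answers.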

Microlearning: TEL cards

Daniel Hardy and Matthew Street from Keele showed us the cards they had produced to promote various practices to staff. These sit within the VLE. The TEL cards code is available on GitHub.


Provocations

In the Breaking Open session, we were given a series of provocations relating to who is excluded from or disadvantaged by open education practices. I like the way we (in groups of 6 or so) were asked to interact with these provocations:

  • Choose one of the statements to work with
  • What is the worst case – the worst thing that could happen?
  • What could you do to make that worst case happen?
  • What are you doing that might be contributing to the worst case?

The session worked well, although on my table at least there seemed to be some defensiveness and a fixed idea that open = good. I appreciated having contributors videoconference in and form their own virtual workshop table for the activity. Further information, including the provocations, is on the Towards Openness site.

Lightning keynotes

The final keynote was left open, and people were invited to come forward during the event if they would like to give a 5-minute reflection in this session. Honestly, I was a little sceptical about how this would work, but it was fantastic. I was particularly pleased to see two of the people whose earlier sessions I had found most interesting, Taskeen Adam and Prittee Auckloo, giving their take on what they had seen.

Inspiring student projects

Addressing shortage of materials / perspectives through OER

Lorna Campbell, in her keynote, mentioned an Edinburgh project addressing lack of materials around LGBT+ healthcare, with students adapting existing materials.

Welsh Wikipedia content

Jason Evans, National Wikipedian at the National Library of Wales, works with university and school students to help them write and contribute to Welsh-language wikipedia. Basque universities have used a similar model with their students.

Moving witch trials data to Wikidata

Ewan McAndrew from Edinburgh talked about working with MSc Data Design students to move an existing Access database of information about witchcraft trials onto Wikidata to make it available to researchers. Students also produced videos using the data.

Geoscience Outreach course

Stephanie (Charlie) Farley from Edinburgh talked about a course within Geoscience on co-creation of OERs. Students are paired up with community organisations, schools, etc and work to produce a piece of science communication or an educational resource for that group. Students have produced events, apps and board games, as well as video and learning materials. The university hires student interns over the summer who work with selected students to polish their projects and promote them as OERs.

Thoughts from a recent GW4 meeting at University of Bath

On Friday 23rd March, Mike, Naomi, Robyn, Han and I headed over to Bath for the latest GW4 meeting of minds. As decided in the previous meeting, the main topics for discussion were e-assessment and portfolios, but we also discussed MOOC development and learning analytics. Unfortunately, no one from Exeter could make it up this time, so it was us from Bristol, along with colleagues from Bath and Cardiff. As before, we used Padlets to pool ideas and discussion points as we talked in smaller groups.

Portfolios 

Portfolios seem to be a common focus (dare I even say, headache). Bath and Cardiff have been using Mahara, and have been trying to overcome some of its limitations in-house. There was a strong feeling that none of us have found a portfolio which delivers what we need, and that if we ganged up on the providers they might be able to find a solution. The next step is to try to define what it is we do need from a portfolio, which tools we use (or have already investigated), and what we can do to find a common solution. Some immediate themes were e-portfolios as assessment tools (and how they integrate with current systems), GDPR implications, students being able to share parts of portfolios externally and internally, and how long students can have access to their portfolio.

MOOCS

As something we all have experience of, to a greater or lesser degree, there was inevitably quite a bit of discussion around MOOCs. We talked about the processes we follow to develop MOOCs, and the different support we provide to academics. For example, Gavin from Bath showed us how he uses Camtasia to produce videos in house; in fact, he was able to knock up an example of such a video in 20 minutes during the session, with mini interviews and shots from the day. We also discussed the data we get from FutureLearn, and how we all find it difficult to do anything with that data. With so much information, and not much time, it tends to become something we’d all like to do more with but never quite find the time for.

The discussion also returned to an idea we’ve been kicking around GW4 for a while: that of a collaborative MOOC. We discussed the idea of perhaps making courses for pre-entry undergrads, or students embarking on PhDs, or perhaps staff development and CPD courses for new academics (which Cardiff are already building a bank of in FutureLearn). The idea of creating small modular courses or MOOCs, where each of us could provide a section based on our own expertise and interests, was also popular…let’s see how this develops!

E-assessment

Tools and systems around e-assessment were also a common theme. As well as thinking about Blackboard assignments, use of Turnitin and QMP, there was also talk about peer assessment tools and practice, and adopting a BYOD approach. It seemed that our experiences of e-assessment were very mixed, with huge disparity in adoption and approach within our institutions. We’re all working on e-assessment, it seems – for example our EMA project, which is quite similar to that of Bath. However, other trials are also going ahead, such as Cardiff’s trial of ‘Inspera‘. I think we’re all keen to see what their experiences of that project are, as the Scandinavian approach to e-exams has often been heralded as the future!

What next?

For the future, we discussed more of a ‘show and tell’ approach, where we could get a closer look at some of the things we’re up to. There was also talk of upping our use of communication channels in between meeting in person, particularly using the Yammer group more frequently, and perhaps having smaller virtual meetings for specific topics.

It wasn’t decided who would host the next session, particularly as Exeter weren’t represented, although we did tentatively offer to host here at Bristol. But, seeing as Bath really did set the bar high for lunch expectations – with special mention to the excellent pies and homemade cake – if we do host I think we’d better start planning the food already…!


Reflections on the ABC mini-conference from Suzanne

Heading to London for the ABC mini event on Friday 9 February at UCL, I was a tiny bit apprehensive. This curriculum development tool was something I had used, in various forms, but without ever actually seeing how it should be ‘properly’ done, or receiving any training from Clive and Natasha, who came up with it. What I soon found was that our renegade use of the tool wasn’t in fact that renegade.

The morning session, where I got to actually try to develop a course using the tool, was pretty reassuring. It turns out I had actually been running the sessions ‘properly’ after all, which I would say is testament to how straightforward and logical the tools are to use.

After being on the other side of the table during a session, I learned how enjoyable it is to make such visible progress in such a short time. I also realised how much you have to remember if you end up talking through a whole sequence of learning without noting down the detail (ie, before you ‘flip the cards’). By the time we came to adding detail, we all had to try and remember what we’d had in mind. This is definitely something I’ll bear in mind the next time I run a session.


As well as the hands on session, hearing about what others have been using the method for, and what they had learned from it, was inspiring. The main things that stuck in my mind were:

  • How useful the method is as a review tool (whereas I had previously used it to design new courses). It helps people visualise and recognise all the great things they already do, before thinking about how they might want to develop their course for the future. The act of discussing it with others surfaces long-held beliefs and assumptions which might no longer apply. When redesigning a course, unit or programme, I can see how helpful this might be.
  • Secondly, this tool is really effective at a programme level. The evaluation of individual courses or units seems to take on a new dimension when done in a room with all the units and courses in the programme being evaluated at the same time. Without asking people to do this explicitly, connections between units can be spotted and developed, duplication can be discussed, and people involved across the whole programme can start to get a real sense of what the students’ experience of the whole programme actually is. A ‘ground-up’ programme development seems to happen, which is more holistic and sustainable than a ‘top-down’ directive.

For our purposes, this certainly seems like a useful tool for two big projects that the University of Bristol is tackling: programme level assessment, and embedding the Bristol Futures themes into the core curriculum. Being able to quickly map where things already happen, and then talk about it in an open and positive environment, could be a really engaging way to get these conversations started. Let’s see where learning our ABCs can get us…

ABC mini conference – talk from Bristol

Notes from Suzanne Collins and Suzi Wells on using the ABC cards in Bristol. This talk was given at the ABC mini conference, UCL, London, 9 March 2018. See the ABC Learning Design web pages for further resources.

Suzi: Trialling ABC as a tool in workshops

I first came across the ABC curriculum design method while browsing UCL’s digital education pages looking for ideas. It immediately appealed. My background is in structuring and building websites, and I had used paper-based storyboarding in that context.

First trial: a single unit

Colleagues were enthusiastic and we started looking for contexts to trial it. An academic approached us with a view to involving us in significantly redesigning a unit, and we suggested the ABC approach.

As a tool for discussion, and for engaging a more diverse group of people – two academics, two learning technologists, one librarian, and someone else – it worked very well. They were very engaged and all could contribute, although they couldn’t agree on a single tweet.

But we didn’t complete all the activities in the time available. We also didn’t talk to them about how it should fit into the overall development cycle, and didn’t have much opportunity to follow up on what came next. To me it felt like there was less value in talking about a single unit in isolation, and that there would have been more benefit if we’d been working on a programme.

It was a useful tool and an enjoyable session but it wasn’t right yet.

Second trial: developing online courses

Not long after that we were asked to get involved in developing three online courses which would be promoted to our own students, as well as to the public more widely. Each course would be developed by a group of academics from a variety of different disciplines, many of whom had not worked together before.

The timescales were extremely short (by university standards). The academics involved were extremely busy with their existing work. These courses had to be innovative, transformative, cross-disciplinary, interlinked, approachable by anyone, essentially self-sustaining … and should encourage the development of transferable skills. No small ask.

Having pitched their ideas and been selected to lead or participate, the teams were assembled for an initial one day event. As part of this we ran several short sessions. We asked them to do an elevator pitch (they resolutely failed to follow the instructions on this). We also did a pre-mortem (imagine it’s a year down the line and these projects have been an absolute disaster, tell us what went wrong – very popular and a great way of surfacing problems and clearing the air).

We then ran an ABC workshop, with three tables – myself and my colleagues Roger Gardner and Mike Cameron running a table each.

We modified the cards slightly to make them more platform-focused. We also added a time wheel to each week. Students would be expected to spend three hours a week in total on these courses and from conversations we’d had with the academics we knew that they were veering towards providing three hours of video a week (plus readings and activities). We wanted to focus attention on how students would spend their time.

We attempted to fit all this within an hour, because that was all the time available in the schedule.

For stimulating discussion, getting everyone to contribute, and shifting focus towards the student experience, it worked well. The teams understood it and could work with it quickly. We were definitely over-ambitious about how much we could get through in an hour. Added to this, it was too early in the process and teams still had divergent or vague ideas about content (even on a big-picture scale) which couldn’t be resolved in the short time available.

One interesting finding was about the value of pushing people through the process. The other two facilitators used the framework and cards but took a more freeform approach, allowing discussions to run on. I was much stricter, pushing people through the activities. At the end of the day, my group was the only one that asked to take the cards away and declared that they would use them themselves. Working through all the activities seemed to help people see the value of the process (though of course that may not mean that the discussion was more valuable).

Suzanne: Using ABC throughout online course design

My experiences of using the ABC method came later in the process of developing these online courses. My colleague Hannah O’Brien and I worked intensively with the three course teams, and we turned to ABC to help us do that. When we started, there were a lot of ideas – too many, in fact! – and we tried to find ways to get those ideas on to paper, so that we could all evaluate them and work them into a course design.

We ran a series of shorter, small group ABC sessions, using the modified cards from Suzi and Roger’s previous session. The courses were going to end up on the FutureLearn platform, so the course design by nature needed to work as a linear sequence of weeks of learning. In each week, we needed a series of ‘activities’, which were made up of different ‘steps’. Anyone familiar with FutureLearn can tell you that there isn’t a great deal of choice for what these steps are: a text article, a video, a discussion, a quiz, or a limited selection of external ‘exercises’.

What the ABC sessions highlighted early on for our teams was that having lots of video and articles explaining ideas might look jazzy, but it is all very similar (and not very active) in terms of learning types. We all noticed there was far too much of the turquoise ‘acquisition’ happening in courses which were designed to develop skills such as communication and self-efficacy.
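To make that ‘acquisition-heavy’ pattern concrete, here is a toy sketch of the kind of balance check the ABC cards make visible. The six learning types are Laurillard’s, as used by ABC; the example week plan below is invented for illustration:

```python
from collections import Counter

# Laurillard's six learning types, as used in the ABC method
LEARNING_TYPES = {"acquisition", "investigation", "discussion",
                  "practice", "collaboration", "production"}

# An invented week plan: (step, learning type) pairs
week_plan = [
    ("intro video", "acquisition"),
    ("reading", "acquisition"),
    ("expert interview video", "acquisition"),
    ("discussion prompt", "discussion"),
    ("quiz", "practice"),
]

counts = Counter(learning_type for _, learning_type in week_plan)
total = len(week_plan)

# Print each type's share of the week, flagging anything over half
for lt in sorted(LEARNING_TYPES):
    share = counts.get(lt, 0) / total
    flag = "  <-- heavy" if share > 0.5 else ""
    print(f"{lt:13s} {counts.get(lt, 0)} ({share:.0%}){flag}")
```

Laying the cards out on the table does this tally at a glance, of course; the point is simply that once steps are labelled by learning type, an imbalance like three videos out of five steps becomes impossible to miss.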

To help our academics come up with alternative ideas for how students could, within the limits of FutureLearn, have a more interactive and challenging learning experience, we also created a bank of good examples, which we called our ‘Activity Bank’. As we worked together to come up with ideas for collaboration or inquiry, for example, we could direct the academics to explore these examples and adapt the ideas for their own purposes.

Overall, ABC ended up being a useful tool for getting everyone talking about the pedagogical choices they were making in a similar way. We could map the learning experience quickly and visually, so that we could prototype, evaluate and iterate course designs. It also kept us all clearly focused on what the learners were doing during the course, rather than on how amazingly we were presenting the materials.

Since then, I’ve found myself returning to the ABC tools and ideas regularly. The learning type ‘colours’ got quite embedded in our way of thinking and documenting learning designs. They cropped up in a graphic course design map created to demonstrate the pedagogical choices for the online courses (see below), and are now doing so again in a different context.

This new context, and the next big project for me, is the Bristol Futures Optional Units. These are blended, scalable, credit-bearing, multidisciplinary, investigative units, open to all students, built around the Bristol Futures themes of Global Citizenship, Innovation and Enterprise, and Sustainable Futures. So, no small ask, once again.

For this, the ABC cards have been tweaked again, this time to generate ideas for both online and face-to-face course elements, allowing for a flexible, student-choice-driven learning experience. How can we provide a similar learning experience for students who might end up taking the unit in very different ways? We’re in the early days of course design, but I imagine that we’ll end up using the ABC workshops in various forms during the coming year!

In all, ABC has become a bit of an ace up our sleeves. When we need teams to work more collaboratively, when we need the focus shifted back to the student, when we need to make progress rapidly and efficiently, even when we come to evaluate learning design – the ABC tools seem to provide us with a way to talk, act, design, and iterate.

Reflections on the ABC mini-conference from Suzi

On Friday 9 March my colleague Suzanne Collins and I made our way to UCL’s London Knowledge Lab, round the back of Lamb’s Conduit Street, to attend a mini-conference on the ABC curriculum design methodology developed by Clive Young and Nataša Perović.

We’ve been using an adapted version of ABC at Bristol for just over a year, so it was great to see Clive and Nataša in action at the masterclass, and to hear about the great work being done at Glasgow, Canterbury Christ Church and Reading.

Some useful points from the day:

  • Glasgow have been using an online tool to make an electronic version (and have templates available).
  • Canterbury Christ Church have used PowerPoint to create an electronic copy while the workshop runs.
  • Other coloured stars have been added to make things visible: places where courses engage with the education strategy; where they develop employability skills; and other priorities identified by the course teams.
  • Who is in the workshop is critical. Do you have students? Library staff? A critical friend?
  • It’s not just us – everybody adapts the cards (sometimes they even change the colours).

During the morning session, people talked about using the cards with students, to allow them to design the course. One speaker suggested using them alongside evidence of BAME and gender engagement in different types of activity, to address how well the course works for different learners. It was great to see how quickly people picked up the idea and started taking it on as their own.

Lots of potential and positivity. I look forward to seeing how the network grows.