OER18 – some of my favourite ideas from the conference

A couple of weeks ago I went along to OER18. There was a lot to like about the event and so much I’d have loved to hear more about. Here are some of my favourite ideas from the talks I attended…

Helping staff understand copyright for reuse

Glasgow Caledonian found that understanding copyright was a barrier to their staff reusing content, so they made a quick, self-service copyright advisor. It’s very easy to use, with a traffic-light system indicating whether you can go ahead, need to investigate further, or can’t use the resource. The advice is cc-by licensed so could easily be repurposed, and they are currently developing an HTML5 version.

Approaches to institutional repositories

Southampton have developed EdShare for managing and hosting open content, with EdShare Hub now being developed to bring together content from the institutions using EdShare. It has been integrated into their systems and processes: their comms and marketing team use EdShare behind their iTunesU presence, and their medical school has MedShare. For further information see this presentation on EdShare from the ALT 2017 Winter Conference.

Edinburgh have an OER policy but they don’t have an institutional repository. Resources are shared on whichever online platform is most appropriate. They have accounts on Vimeo, Flickr, and similar services and through this approach hope to encourage true openness and adaptability. They also have a media asset management platform called Media Hopper.

Teaching APIs through Google Sheets

Martin Hawksey ran a good session, introducing the basics of APIs using a practical Google Sheets / Flickr exercise. Martin’s slides and the associated worksheet are available for reuse (cc-by).
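The workshop called the Flickr API from a spreadsheet; the same kind of request can be sketched in Python. This is a minimal illustration, not part of Martin’s materials – the helper function name is mine, and the API key is a placeholder you would replace with your own:

```python
from urllib.parse import urlencode

def flickr_search_url(text, api_key="YOUR_KEY", per_page=5):
    """Build a flickr.photos.search request URL that returns plain JSON.

    The Flickr REST API takes a method name plus parameters in the
    query string; nojsoncallback=1 asks for raw JSON rather than JSONP.
    """
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "text": text,
        "per_page": per_page,
        "format": "json",
        "nojsoncallback": 1,
    }
    return "https://api.flickr.com/services/rest/?" + urlencode(params)

print(flickr_search_url("open education"))
```

Pasting the resulting URL into a browser (with a real key) shows the raw response – which is essentially what the Google Sheets exercise surfaces for workshop participants.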

Microlearning: TEL cards

Daniel Hardy and Matthew Street from Keele showed us the cards they had produced to promote various practices to staff. These sit within the VLE. The TEL cards code is available on GitHub.


Provocations

In the Breaking Open session, we were given a series of provocations relating to who is excluded from or disadvantaged by open education practices. I like the way we (in groups of 6 or so) were asked to interact with these provocations:

  • Choose one of the statements to work with
  • What is the worst case – the worst things that could happen?
  • What could you do to make that worst case happen?
  • What are you doing that might be contributing to the worst case?

The session worked well, although on my table at least there seemed to be some defensiveness and a fixed idea that open = good. I appreciated having contributors videoconference in and form their own virtual workshop table for the activity. Further information, including the provocations, is on the Towards Openness site.

Lightning keynotes

The final keynote was left open: people were invited, during the event, to come forward if they would like to give a 5-minute reflection in this session. Honestly, I was a little sceptical about how this would work, but it was fantastic. I was particularly pleased to see two of the people whose earlier sessions I had found most interesting, Taskeen Adam and Prittee Auckloo, giving their take on what they had seen.

Inspiring student projects

Addressing shortage of materials / perspectives through OER

Lorna Campbell, in her keynote, mentioned an Edinburgh project addressing the lack of materials around LGBT+ healthcare, with students adapting existing materials.

Welsh Wikipedia content

Jason Evans, National Wikipedian at the National Library of Wales, works with university and school students to help them write and contribute to Welsh-language Wikipedia. Basque universities have used a similar model with their students.

Moving witch trials data to Wikidata

Ewan McAndrew from Edinburgh talked about working with MSc Data Design students to move an existing Access database of information about witchcraft trials onto Wikidata to make it available to researchers. Students also produced videos using the data.
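Once records like these are on Wikidata they become queryable via its public SPARQL endpoint. A hedged sketch of what such a query might look like – the property ID below is a deliberate placeholder, not the real identifier for the witchcraft survey data, and the query-building helper is mine:

```python
def witch_trials_query(prop="Pxxxx", limit=10):
    """Build a SPARQL query listing people linked to a (placeholder)
    Wikidata identifier property, with English labels attached.

    The wdt: prefix and the wikibase:label service are standard
    Wikidata SPARQL idioms; only the property ID is hypothetical.
    """
    return f"""
SELECT ?person ?personLabel WHERE {{
  ?person wdt:{prop} ?id .
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
}}
LIMIT {limit}
""".strip()

print(witch_trials_query())
```

A query like this would be sent to https://query.wikidata.org/sparql – which is exactly the kind of researcher access the migration from a closed Access database makes possible.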

Geoscience Outreach course

Stephanie (Charlie) Farley from Edinburgh talked about a course within Geoscience on co-creation of OERs. Students are paired with community organisations, schools, etc, and work to produce a piece of science communication or an educational resource for that group. Students have produced events, apps and board games, as well as video and learning materials. The university hires student interns over the summer who work with selected students to polish their projects and promote them as OERs.

Thoughts from a recent GW4 meeting at University of Bath

On Friday 23rd March, Mike, Naomi, Robyn, Han and I headed over to Bath for the latest GW4 meeting of minds. As decided at the previous meeting, the main topics for discussion were e-assessment and portfolios, but we also discussed MOOC development and learning analytics. Unfortunately, no one from Exeter could make it this time, so it was us from Bristol, along with colleagues from Bath and Cardiff. As before, we used Padlets to pool ideas and discussion points as we worked in smaller groups.

Portfolios 

Portfolios seem to be a common focus (dare I even say, headache). Bath and Cardiff have been using Mahara, and have been trying to overcome some of its limitations in-house. There was a strong feeling that none of us have found a portfolio which delivers what we need, and that if we ganged up on the providers they might be able to find a solution. The next step is to define what we need from a portfolio, which tools we use (or have already investigated), and what we can do to find a common solution. Some immediate themes were e-portfolios as assessment tools (and how they integrate with current systems), GDPR implications, students being able to share parts of portfolios externally and internally, and how long students can have access to their portfolio.

MOOCs

As something we all have experience of, to a greater or lesser degree, there was inevitably quite a bit of discussion around MOOCs. We talked about the processes we follow to develop MOOCs, and the different support we provide to academics. For example, Gavin from Bath showed us how he uses Camtasia to produce videos in-house; in fact, he was able to knock up an example video in 20 minutes during the session, with mini interviews and shots from the day. We also discussed the data we get from FutureLearn, and how we all find it difficult to do anything with that data. With so much information, and not much time, it tends to be something we’d all like to do more with but never quite find the time for.

The discussion also returned to an idea we’ve been kicking around GW4 for a while: a collaborative MOOC. We discussed making courses for pre-entry undergraduates, or students embarking on PhDs, or perhaps staff development and CPD courses for new academics (which Cardiff are already building a bank of in FutureLearn). The idea of creating small modular courses or MOOCs, where each of us could provide a section based on our own expertise and interests, was also popular… let’s see how this develops!

E-assessment 

Tools and systems around e-assessment were also a common theme. As well as thinking about Blackboard assignments and use of Turnitin and QMP, there was also talk about peer assessment tools and practice, and adopting a BYOD approach. It seemed that our experiences of e-assessment were very mixed, with huge disparity in adoption and approach within our institutions. We’re all working on e-assessment, it seems – for example our EMA project, which is quite similar to that of Bath. Other trials are also going ahead, such as Cardiff’s trial of ‘Inspera’. I think we’re all keen to see what their experiences of that project are, as the Scandinavian approach to e-exams has often been heralded as the future!

What next? 

For the future, we discussed more of a ‘show and tell’ approach, where we could get a closer look at some of the things we’re up to. There was also talk of upping our use of communication channels between in-person meetings, particularly using the Yammer group more frequently, and perhaps having smaller virtual meetings for specific topics.

It wasn’t decided who would host the next session, particularly as Exeter weren’t represented, although we did tentatively offer to host here at Bristol. But, seeing as Bath really did set the bar high for lunch expectations – with special mention to the excellent pies and homemade cake – if we do host I think we’d better start planning the food already…!  

Reflections on the ABC mini-conference from Suzanne

Heading to London for the ABC mini event on Friday 9 March at UCL, I was a tiny bit apprehensive. This curriculum development tool was something I had used, in various forms, but without ever actually seeing how it should be ‘properly’ done, or ever receiving any training from Clive and Natasha, who came up with it. What I soon found was that our renegade use of the tool wasn’t in fact that renegade.

The morning session, where I got to actually try to develop a course using the tool, was pretty reassuring. It turns out I had actually been running the sessions ‘properly’ after all, which I would say is testament to how straightforward and logical the tools are to use.

After being on the other side of the table during a session, I learned how enjoyable it is to make such visible progress in such a short time. I also realised how much you have to remember if you talk through a whole sequence of learning without noting down the detail (i.e. before you ‘flip the cards’). By the time we came to adding detail, we all had to try to remember what we’d had in mind. This is definitely something I’ll bear in mind the next time I run a session.


As well as the hands-on session, hearing about what others have been using the method for, and what they had learned from it, was inspiring. The main things that stuck in my mind were:

  • How useful the method is as a review tool (I had previously only used it to design new courses). It helps people visualise and recognise all the great things they already do, before thinking about how they might want to develop their course for the future. The act of discussing it with others surfaces long-held beliefs and assumptions which might no longer apply. When redesigning a course, unit or programme, I can see how helpful this might be.
  • Secondly, this tool is really effective at programme level. The evaluation of individual courses or units takes on a new dimension when done in a room with all the units and courses in the programme being evaluated at the same time. Without asking people to do this explicitly, connections between units can be spotted and developed, duplication can be discussed, and people involved across the whole programme can start to get a real sense of what the students’ experience of the whole programme actually is. A ‘ground-up’ programme development seems to happen, which is more holistic and sustainable than a ‘top-down’ directive.

For our purposes, this certainly seems like a useful tool for two big projects that the University of Bristol is tackling: programme level assessment, and embedding the Bristol Futures themes into the core curriculum. Being able to quickly map where things already happen, and then talk about it in an open and positive environment, could be a really engaging way to get these conversations started. Let’s see where learning our ABCs can get us…

ABC mini conference – talk from Bristol

Notes from Suzanne Collins and Suzi Wells on using the ABC cards in Bristol. This talk was given at the ABC mini conference, UCL, London, 9 March 2018. See the ABC Learning Design web pages for further resources.

Suzi: Trialling ABC as a tool in workshops

I first came across the ABC curriculum design method while browsing UCL’s digital education pages looking for ideas. It immediately appealed. My background is in structuring and building websites, and I had used paper-based storyboarding in that context.

First trial: a single unit

Colleagues were enthusiastic and we started looking for contexts to trial it. An academic approached us with a view to involving us in significantly redesigning a unit, and we suggested the ABC approach.

As a tool for discussion, and for engaging a more diverse group of people – two academics, two learning technologists, one librarian, and someone else – it worked very well. They were very engaged and all could contribute. Although they couldn’t agree on a single tweet.

But we didn’t complete all the activities in the time available. We also didn’t talk to them about how it should fit into the overall development cycle, and didn’t have much opportunity to follow up on next steps. To me it felt like there was less value in talking about a single unit in isolation, and that there would have been more benefit if we’d been working on a programme.

It was a useful tool and an enjoyable session but it wasn’t right yet.

Second trial: developing online courses

Not long after that we were asked to get involved in developing three online courses which would be promoted to our own students, as well as to the public more widely. Each course would be developed by a group of academics from a variety of different disciplines, many of whom had not worked together before.

The timescales were extremely short (by university standards). The academics involved were extremely busy with their existing work. These courses had to be innovative, transformative, cross-disciplinary, interlinked, approachable by anyone, essentially self-sustaining … and should encourage the development of transferable skills. No small ask.

Having pitched their ideas and been selected to lead or participate, the teams were assembled for an initial one day event. As part of this we ran several short sessions. We asked them to do an elevator pitch (they resolutely failed to follow the instructions on this). We also did a pre-mortem (imagine it’s a year down the line and these projects have been an absolute disaster, tell us what went wrong – very popular and a great way of surfacing problems and clearing the air).

We then ran an ABC workshop with three tables, myself and my colleagues Roger Gardner and Mike Cameron each running one.

We modified the cards slightly to make them more platform-focused. We also added a time wheel to each week. Students would be expected to spend three hours a week in total on these courses and from conversations we’d had with the academics we knew that they were veering towards providing three hours of video a week (plus readings and activities). We wanted to focus attention on how students would spend their time.

We attempted to fit all this within an hour, because that was all the time available in the schedule.

For stimulating discussion, getting everyone to contribute, and shifting focus towards the student experience it worked well. The teams understood it and could work with it quickly. But we were definitely over-ambitious about how much we could get through in an hour. Added to this, it was too early in the process: teams still had divergent or vague ideas about content (even on a big-picture scale) which couldn’t be resolved in the short time available.

One interesting finding was about the value of pushing people through the process. The other two facilitators used the framework and cards but took a more freeform approach, allowing discussions to run on. I was much stricter, pushing people through the activities. At the end of the day my group were the only one who asked to take the cards away and declared that they would use it themselves. Working through all the activities seemed to help people see the value of the process (though of course that may not mean that the discussion was more valuable).

Suzanne: Using ABC throughout online course design

My experiences of using the ABC method came later in the process of developing these online courses. My colleague Hannah O’Brien and I worked intensively with the three course teams, and we turned to ABC to help us do that. When we started, there were a lot of ideas, too many in fact(!), and we tried to find ways to get those ideas somehow on to paper, so that we could all evaluate them, and work them into a course design.

We ran a series of shorter, small-group ABC sessions, using the modified cards from Suzi and Roger’s previous session. The courses were going to end up on the FutureLearn platform, so the design by nature needed to work as a linear sequence of weeks of learning. In each week, we needed a series of ‘activities’, made up of different ‘steps’. Anyone familiar with FutureLearn can tell you that there isn’t a great deal of choice for what these steps are: a text article, a video, a discussion, a quiz, or a limited selection of external ‘exercises’.

What the ABC sessions highlighted early on for our teams was that having lots of video and articles explaining ideas might look jazzy, but it is all very similar (and not very active) in terms of learning types. We all noticed there was far too much of the turquoise ‘acquisition’ happening in courses which were designed to develop skills such as communication and self-efficacy.

To help our academics come up with alternative ideas for how students could, within the limits of FutureLearn, have a more interactive and challenging learning experience, we also created a bank of good examples, which we called our ‘Activity Bank’. As we worked to come up with ideas for collaboration or inquiry, for example, we could direct them to explore these examples and adapt the ideas for their own purposes.

Overall, the ABC ended up being a useful tool for getting everyone talking about the pedagogical choices they were making in a similar way. We could map the learning experience quickly and visually, so that we could prototype, evaluate and iterate course designs. It also kept us all clearly focused on what the learners were doing during the course, rather than on how amazingly we were presenting the materials.

Since then, I’ve found myself returning to the ABC tools and ideas regularly. The learning type ‘colours’ got quite embedded in our way of thinking and documenting learning designs. They cropped up in a graphic course design map created to demonstrate the pedagogical choices for the online courses (see below), and are now doing so again in a different context.

This new context, and the next big project for me is the Bristol Futures Optional Units. These are blended, scalable, credit bearing, multidisciplinary, investigative units, open to all students, around the Bristol Futures themes of Global Citizenship, Innovation and Enterprise and Sustainable Futures. So, no small ask, once again.

For this, the ABC cards have been tweaked again, this time to generate both online and face-to-face ideas for course elements, to allow for a flexible, student-choice-driven learning experience. How can we provide a similar learning experience for students who might end up taking the unit in very different ways? We’re in the early days of course design, but I imagine we’ll end up using the ABC workshops in various forms during the coming year!

In all, the ABC has become a bit of an ace up our sleeve. When we need teams to work more collaboratively, when we need the focus shifted back to the student, when we need to make progress rapidly and efficiently, even when we come to evaluate learning design – the ABC tools seem to provide us with a way to talk, act, design, and iterate.

Reflections on the ABC mini-conference from Suzi

On Friday 9 March my colleague Suzanne Collins and I made our way to UCL’s London Knowledge Lab, round the back of Lamb’s Conduit Street, to attend a mini-conference on the ABC curriculum design methodology developed by Clive Young and Nataša Perović.

It’s something we’ve been using an adapted version of at Bristol for just over a year, so it was great to see Clive and Nataša in action at the masterclass, and to hear about the great work being done at Glasgow, Canterbury Christ Church and Reading.

Some useful points from the day:

  • Glasgow have been using an online tool to make an electronic version (and have templates available)
  • Canterbury Christ Church have used PowerPoint to create an electronic copy while the workshop runs
  • Other coloured stars have been added to make visible: places where they engage with the education strategy; developing employability skills; other priorities (identified by the course teams)
  • Who is in the workshop is critical. Do you have students? Library staff? A critical friend?
  • It’s not just us – everybody adapts the cards (sometimes they even change the colours).

During the morning session people talked about using the cards with students, to allow them to design the course. One speaker suggested using them with evidence of BAME / gender engagement (in different types of activity), to address the way the course works for different learners. It was great to see how quickly people picked up the idea and started taking it on as their own.

Lots of potential and positivity. I look forward to seeing how the network grows.

Digital and physical spaces – notes from the reading group

Amy read – Institutional to Individual: realising the postdigital VLE? by Lawrie Phipps.

Lawrie starts this article by quoting himself – ‘Digital is about people’. He believes that learning is effective when we are connected in conversations and in groups – this has been proven many times over – but that these conversations should not be confined. The ‘confinement’ he talks of is the attempt by unnamed institutions to restrict their teaching staff by controlling access to, and provision of, alternative tools which, Lawrie argues, often don’t align with their everyday activities. He mentions two projects taking place at universities – the Personalised User Learning & Social Environments (PULSE) project at Leeds Beckett (difficult to find anything about this online) and the Aula team, who have created a ‘conversational layer’ to run alongside a VLE and provide an ‘ecosystem for a range of other tools’.

The article moves on to discuss the emerging trend of disaggregation as an indicator of ‘post-digital academic practice’… I’d be interested to know what he means by this, but the article does not shed any light on it. If I were to guess at its meaning, I would think that the digital is becoming so integrated into our lives that it can no longer be considered a separate practice – it is seamless, and therefore doesn’t need to be recognised. He reminds us to be mindful of the other emerging themes of digital spaces: control, surveillance and ‘weaponised’ metrics used by corporate bodies, and the use of algorithms to control our feeds.

Lawrie finishes by letting us know that ‘the report’ (I presume the ‘Next Generation Digital Learning Environments’ report mentioned earlier in the article) is coming together nicely, and urges the reader to get in touch if they have any relevant cases of disaggregation for practical purposes.

Chrysanthi read Digital sanctuary and anonymity on campus by Sian Bayne.

The article is trying to make a case for anonymity in online social exchange in the context of higher education.

The author points out that, since part of the point of higher education is to help students own and defend their knowledge, anonymity is unusual, barring exceptions like peer review. But this works better for those with privilege than those without, and it doesn’t work for every topic a student might be interested in. In the author’s view, anonymity offers both social value and a way to resist digital surveillance. Looking at the use of an anonymous social media app called Yik Yak – which was popular for 2-3 years, but removed anonymity and then closed – they found it was often used to facilitate anonymous peer support, which was very helpful to students on topics like social difficulties or isolation, relationships, health (sexual and emotional) or teaching-related issues.

Anonymity also serves to resist the ubiquitous surveillance that occurs in large part through social media platforms, which record everything individuals do and like for their financial benefit. But there can be online social networks where students don’t need to hand over their data to be able to use them.

They argue that the absence of an app like this reduces students’ opportunities for peer support, and that the counter-argument usually put forward – that anonymous spaces facilitate abuse – is weak, considering abuse can and does happen everywhere, including on non-anonymous social media like Facebook. They are concerned about where the supportive conversations that people would previously have had anonymously are happening now, for topics like mental health or relationships. Overall, they believe universities need such anonymous spaces and should figure out how to implement them, balancing data, trust and safety.

I think the author makes good points. Regarding where the conversations are happening now, I am assuming:

1. Other anonymous but not higher-education-specific spaces, such as Reddit – which means people will get support, although from a broader population not coming from the same context, with all the challenges this implies.

2. Non-anonymous spaces, like Facebook – which means people are essentially broadcasting their issues on platforms that (a) may use this information for their benefit and the student’s detriment, and (b) store and display the data with the user’s name for a long time, with no guarantees about who can or cannot see it. This makes abuse easier, as well as enabling people looking up the individual (e.g. future employers) to see information they should otherwise not have access to.

3. They are not getting support, which could lead to isolation.

Overall, I do see the point of universities implementing anonymous digital spaces for their students.

Naomi read: The SCALE-UP Project: A Student-Centred Active Learning Environment for Undergraduate Programmes by Robert J. Beichner.

The author starts by describing these SCALE-UP spaces as places where ‘student teams are given interesting things to investigate, while their instructor roams.’ This is one of the few places where we hear about how the actual space is designed to improve learning and collaboration. The purpose of these teaching spaces is to encourage discussion between students and their peers. By working in small groups on separate tables within the classroom, students can work on separate activities and use a shared laptop or whiteboard to research or note down their findings. They can then discuss with other groups.

The main point of the paper revolves around the idea of social interaction between students and their teachers being the ‘active ingredient’ in making this approach to teaching work. Beichner talks about how students in these classes gain a better conceptual understanding than students taking traditional lecture-based classes. Studies saw a marked rise in students’ confidence and problem-solving skills, as well as in teamwork and communication. There is some concern about whether this approach means less content is being delivered to the students, but Beichner argues the content is being developed and created by the students themselves.

Discussion-led learning is always going to be popular, but we need to think about the physical space too, and whether it is needed or not. The size of these classes also needs to be considered – what counts as too big? Beichner’s study was interesting, but not surprising, and it would have been good to know how the design of the space and tables aided the learning too.

Suzi read The EDUCAUSE NGDLE and an API of One’s Own by Michael Feldstein (a rebuttal to a rebuttal of the EDUCAUSE Next Generation Digital Learning Environment report).

This is a clear and interesting article discussing where learning management systems (LMSs) could or should go as digital spaces for learning. The perspective is relatively technical, discussing the underlying architecture of the system, but the key ideas are very approachable:

LMSs could move from being one application that tries to do everything to being more like an operating system on a mobile phone – hosting apps and managing the ways they can communicate with each other.

Lego is also used as a metaphor for this more adaptable LMS, but Feldstein discusses the tension between having fairly generic blocks that don’t build anything in particular but allow you to be very creative (Lego from my childhood), and having sets which are intended to build a particular thing but which are then less adaptable (more typical of modern Lego). I found this a harder idea to apply, though I can appreciate that just because something comes in blocks and can be taken apart, doesn’t mean it is genuinely flexible and adaptable.

Personal ownership of data is discussed – the idea of students even hosting their own work and having a personal API via which they grant the institution’s LMS (and hence teachers) access to read and provide feedback on their work (“an API of one’s own”). This seems to me an attractive idea, in a purist origins-of-the-web way. People have suggested similar approaches in various domains, social media in particular, and I don’t know of any that have worked.

Suzanne read Semiotic Social Spaces and Affinity Spaces: From The Age of Mythology to Today’s Schools by James Paul Gee. The premise of this text is to reconsider the idea of a community of practice: to think about it as related to the space in which people interact (and in what way), rather than membership of the community (particularly membership given to people by others, or through arbitrary groupings). Gee argues that thinking about community in this way is more useful, as membership means so many different things to different people, so trying to decide who is ‘in’ or ‘out’ of a group is problematic.

He explains his alternative to the ‘community of practice’ – the ‘affinity space’ – in quite a lot of detail, using the analogy of a real-time computer game as an example, which I won’t try to explain fully here. Some key ideas are that there needs to be some kind of content, generated by the community around a common endeavour. The people who interact with this content do so with an agreed set of ‘signs’ with their own particular ‘grammar’ or rules. This grammar can be internal (signs decided on within the group) or external (e.g. the way that people’s beliefs and identities are formed around these signs, and their relationship with them), and the external grammar can influence the internal grammar.

Another interesting aspect is the idea of portals: an affinity space will have a number of ways that people can interact with it. To take the game example, the game itself could be a portal, but so could a website about game strategy, or a forum discussing the game. Importantly, the content, signs and grammar of the space can be changed by those interacting through those portals, so the content is not fixed. The final points are that people interacting in the space are both ‘expert’ and ‘novice’, and both intensive and extensive knowledge is valued. Individuals with specific skills, or who have a great amount of knowledge about a specific thing, are as valued by the space as those who work to build a more distributed community of knowledge, and there are many different ways people can participate.

Gee’s text presents quite an in-depth concept, which seems quite theoretical. However, thinking about something like the Bristol Futures themes (Global Citizenship, Innovation and Enterprise or Sustainable Futures), we discussed how it might be applied, and how it might help us to think about things like reward and recognition, or success measures, in a very different way.

Suggested reading

Notes from the recent GW4 meeting at Cardiff University

Last Friday, Han, Mike and I attended a GW4 event in Cardiff, where the main topics on the agenda were students as collaborators and shared projects we could embark on together.

The day started with brief updates from each team:

Cardiff…

  • Have a new space for learning and teaching experimentation
  • Are working on a Curriculum Design Toolkit, as part of which they are looking at unbundling content to work in different ways for different markets
  • Have a Learning Hub Showcase (http://www.cardiff.ac.uk/learning-hub)
  • Have funding that students can bid for, for teaching projects
  • Ran summer intern projects – one of which focused on advice on how to use lecture capture
  • Had a blank course rollover with a new minimum standard

Bath…

  • Have a major curriculum design project upcoming
  • Are moving towards programme-level assessment rather than modular
  • Have new funding for staff and students to work together
  • Are championing a flipped learning approach
  • Have a placement student
  • Are working on the ‘Litebox project’ (http://blogs.bath.ac.uk/litebox/), in which students create an environment for the whole University to learn about new and existing technologies for learning and teaching, and to share their experiences of them
  • Are expanding their distance learning postgraduate numbers


During the second part of the day, we talked about students as producers. Splitting into small groups, we shared our experiences, challenges and tips for working with students on both accredited and unaccredited courses. It was widely accepted by the group that collaboration with students is mutually beneficial. Students are able to move from being passive consumers of knowledge to genuine partners in their education, and we as professionals have a lot to gain from the expertise, connections with other students, and knowledge of life at the university that students can offer us.

The experience of working with students at Bristol, Bath and Cardiff has been positive but limited. All three universities have hired student interns in the past, but would like to do more to make ‘students as producers’ a key underpinning concept in accredited courses. Students’ expectations of what university learning will be like can dent their willingness to engage with accredited collaborative projects. We discussed how students may see universities as institutions of teaching rather than of learning, particularly as tuition fees have risen, and expect more teacher-to-student time for their money. Our group talked about introducing the idea of innovative learning techniques earlier in students’ degree programmes, and even on pre-university open days, in order to shift students’ expectations from traditional lecture-based learning towards problem-based modules and more.

For the last part of the day, we talked about projects that the GW4 could collaborate on, and contributed to this padlet board. We all shared ideas, then each of us cast three votes for the projects we’d like to see most. A common theme was the sharing of knowledge and expertise in areas like FutureLearn, ePortfolios and case studies. We also talked about working together to put pressure on companies or to bid for shared funding in order to improve practice in ways that wouldn’t be possible for a single institution.

Bath have volunteered to host the next meeting in February or March, in which we’ll talk about ePortfolios and assessment.

Programme level assessment – notes from the reading group

Suzi read Handbook from UMass – PROGRAM-Based Review and Assessment: Tools and Techniques for Program Improvement

A really clear and useful guide to the process of setting up programme level assessment. The guide contains well-pitched explanations, along with activities, worksheets, and concrete examples for each stage of the process: understanding assessment, defining programme goals and objectives, designing the assessment, selecting assessment methods, analysing and reporting. Even the “how to use this guide” section struck me as helpful, which is unheard of.

The proviso is that your understanding of what assessment is for would need to align with theirs, or you would need to be mindful of where it doesn’t. As others do, they talk about assessment to improve, to inform, and to prove and they do also nod to external requirements (QAA, TEF, etc in our context). However, their focus is on assessment as part of the project of continual (action) research into, and improvement of, education in the context of the department’s broader mission. This is a more holistic approach that might bring in a wide range of measures including student evaluations of the units, data about attendance, and input from employers. I like this focus but it might not be what people are expecting.

During the group we discussed the idea of combining some of the ideas from this, and the approach Suzanne read about (see below). A central team would collaborate with academic staff within the department in what is essentially a research project, supporting conversations between staff on a project, bringing in the student voice and leaving them with the evidence-base and tools to drive conversations about education in their context – empowering staff.

(Side note – on reflection I’m pretty sure this is the reason this particular reading appealed to me.)

Chrysanthi read Characterising programme‐level assessment environments that support learning by Graham Gibbs & Harriet Dunbar‐Goddet.

The authors propose a methodology for characterising programme-level assessment environments, so that they can later be studied along with the students’ learning.

In a nutshell, they selected 9 characteristics that are considered important either in quality assessment or for learning (e.g. variety and volume of assessment), some of which were similar to the TESTA methodology Suzanne described. They selected 3 institutions that differed in terms of structure (e.g. more or less fixed, with more or less choice of modules, traditional or varied assessment methods, etc.), and 3 subject areas, the same in all institutions. They then collected data about assessment in each, coded every characteristic into 3 categories (low, medium, high), and classified each characteristic for each subject at each institution accordingly. They found that the characteristics were generally consistent within an institution, suggesting a cultural approach to assessment rather than a subject-related one. They also identified patterns, e.g. that assessment aligned well with goals correlates with variety in methods. While the methodology is useful, their coding of characteristics as low-medium-high is arbitrary and their sample small, so the stated quantities in the 3 categories are not necessarily good guidelines.

Chrysanthi also watched a video from the same author Suzanne read about: Tansy Jessop: Improving student learning from assessment and feedback – a programme-level view (video, 30 mins).

The video compared 2 contradictory case studies: one that seemed like a “model” assessment environment, but where the students did not put in much effort, were unclear about the goals and were unhappy; and one that seemed problematic in terms of assessment, but where students knew the goals and were satisfied. The conclusion was that rather than having a teacher plan a course perfectly and transmit large amounts of feedback to each student, it might be worth encouraging students to construct it themselves in a “messy” context, extending constructivism to assessment as well.

Additionally, as students are more motivated by summative assessment, the speaker suggested staged assessment, where students are required to complete some formative assessment that feeds into their summative assessment. Amy & Chris suggested that this has already started happening in some courses.

Finally, the speaker noted that making formative assessment publicly available, such as in blog posts, motivates students; that it would be better if assessment encouraged working steadily throughout the term, rather than mainly at peak times around examinations; and that feedback is important for goal clarity and overall satisfaction.

Both paper and video emphasised the wide variety in assessment characteristics between different programs. In the paper’s authors’ words, “one wonders what the variation might have been in the absence of a quality assurance system”.

The discussion turned to the marking system and the importance students give to the numbers, even though these are often irrelevant to the bigger picture and their future jobs.

Amy shared a summary she had created after attending a Chris Rust assessment workshop at the University. The workshop focussed on the benefits of programme-level assessment, looking at the current problems with assessment in universities and offering practical solutions and advice on creating programme-level assessments. It started by looking at curriculum sequencing – its benefits and drawbacks – illustrated with examples where it had been successful.

Chris then discussed ‘capstone and cornerstone’ modules as a model for programme-level assessment, and explained where it had been a success in other universities. He discussed the pseudo-currency of marks and looked at ways we can alter our marking systems to improve students’ attitudes to assessment and feedback. He ended the session by looking at ways to engage students with feedback effectively, with workshop attendees sharing their own advice on how they engage their students. You can find the summary here.

Suzanne read Transforming assessment through the TESTA project by Tansy Jessop (who will be the next Education Excellence speaker) and Yaz El Hakim, which briefly describes the TESTA project, the methods they use and the outcomes they have noted so far. There are also references within the text to more detailed publications on specific areas of the methods, or on specific outcomes, if you want to find out more detail.

In brief, the TESTA project started in 2009, and has now expanded to 20 universities in the UK, Australia and the Netherlands, with 70 programmes having used TESTA to develop their assessment. The article begins by giving a pretty comprehensive overview of the reasons why programme assessment is so high on the agenda, including the recognition that assessment affects student study behaviours, and that assessment demonstrates what we value in learning, so we should make sure it really is focused on the right things. There is also a discussion of how the ‘modularisation’ of university study has left us with very separated assessments, which make it difficult to really see the impact of assessment practices across a programme, particularly for students who take a slower approach to learning. Ultimately the TESTA project is about getting people to talk about their practices on a ‘big picture’ level, identify areas which could be improved, and then work from a base of evidence to make those improvements. There is a detailed system of auditing current courses, including sampling, interviews with teaching staff and programme directors, student questionnaires, and focus groups. The information from this is then used as a catalyst for discussion and change, which will manifest differently in each programme and context.

The final paragraph of the report sums it up quite well: “The value of TESTA seems to lie in getting whole programmes to discuss evidence and work together at addressing assessment and feedback issues as a team, with their disciplinary knowledge, experience of students, and understanding of resource implications. The voice of students, corroborated by statistics and programme evidence has a powerful and particular effect on programme teams, especially as discussion usually raises awareness of how students learn best.”

Suggested reading

Games and Simulation enhanced Learning (GSeL) Conference

Last Thursday I caught the 8.44am cross-country train to Plymouth to attend the first Games and Simulation enhanced Learning (GSeL) Conference. GSeL is a newly formed interdisciplinary research theme group, part of Plymouth University’s Pedagogic Research Institute and Observatory (PedRIO).

VR Hackathon

Plymouth University 2nd November 2017

The main event was on Friday the 3rd, but a session billed for the previous day – ‘Hackathon: VR for Non-programmers’ – sounded promising. So I ventured down a day early to channel my inner geek. I’ve got a basic (but rusty) understanding of coding, so hoped that the ‘non-programmers’ tagline was true. It turns out the session was well designed for those with little to no experience: Michael Straeubig expertly guided around 15 attendees with differing skills through creating a simple VR equivalent of ‘Hello World’ over the course of 2 hours.

The hackathon was a hands-on workshop: we downloaded A-Frame template project files from Michael’s GitHub, then installed the open-source editor Atom (atom.io) for editing and coding.

Sounds complicated? Yeah, sort of – but Michael’s laid-back-whilst-enthusiastic delivery helped fill in the gaps and moved at a steady pace we could all keep up with. He guided us through creating our first scene and adding in various three-dimensional objects, altering their size and colour. Setting up a local server on our laptops via Atom, we were then able to go beyond viewing the 3D space we’d programmed in a browser and view it through a pair of budget VR goggles (Google Cardboard) on our smartphones.
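For the curious, a minimal A-Frame scene along these lines looks something like the sketch below. This isn’t Michael’s actual template – just an illustrative example using A-Frame’s standard markup, with objects, sizes and colours chosen arbitrarily:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Load the A-Frame library from its CDN -->
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- A scene containing a few simple 3D objects; position is "x y z"
         in metres relative to the viewer, so tweaking size, position and
         colour is just a matter of editing attributes -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Served from a local web server and opened on a phone, A-Frame provides the split-screen stereo view needed for Google Cardboard out of the box.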

It was a great primer for dipping toes/feet/legs into creating simple VR spaces from scratch using free tools. The A-Frame project files supplied had additional examples of how to extend and develop. I don’t mind admitting I spent a large part of the rest of the day tinkering. As Michael drily observed during the session, we’d become ‘cool coders’.

Games and Simulation enhanced Learning (GSeL) Conference

Plymouth University 3rd November 2017

A day of talks and workshops based around the use of Games and Simulations, both real-life and digital. Some personal highlights for me included:

Professor Nicola Whitton’s keynote ‘Play matters: exploring the pedagogic value of games and simulation’ which tapped eloquently into themes like Failing without Consequence and motivation/engagement through playing games.

Matthew Barr (University of Glasgow) ‘Playing games at University: the role of video games in higher education and beyond’ – a great talk about his work with ‘gaming groups’ and the benefits cooperative video game playing brought students. “If I ruled the world, every student would play Portal 2”.

James Moss (Imperial College) ‘Gamification: leveraging elements of game design in medical education’ – some brilliant examples of using scenario-based games in medical education. ‘Stabbed to Stable’ involved scenario/persona-based learning, a horizontal whiteboard, post-it notes and pens, with students clustered around trying to map out the processes (checks/actions) they needed to go through, whilst James periodically guided them or threw related spanners into the works. An overhead time-lapse video showed a dynamic session in action. A second game involved teams becoming the ‘medical officer’ helping a team of characters climb Everest. This simulation included mountain noise recordings (incrementally getting louder), random wildcards presenting challenges, lighting changes and James squirting participants in the face with water.

Michael Parsons (University of South Wales) ‘Keeping it Real: Integrating Practitioners in a Public Relations Crisis Simulation’ – shared his experience running a real-time simulation for PR students. Students attempted to handle a recreation of the infamous Carnival Triumph ‘Poop Cruise’ in the University’s Hydra Minerva Suite. The simulation used news report recordings, archived social media posts and live interaction with actors via telephones over several ‘acts’ to simulate a PR team’s attempts to handle a particularly disastrous voyage. It all went well until the passengers were close enough to land to get mobile phone reception (and access to social networks).

The conference presented a feast of examples of using games and simulations in teaching and learning. From creating crosswords, to using digital badges to recognise achievements, to data visualisation in virtual reality, the place was abuzz with ideas. The focus on the potential of play and gaming to engage students meant the event had something for everyone, whether die-hard techie or strictly analogue.

Role and future of universities – notes from the reading group

Maggie read Artificial intelligence will transform universities. Here’s how – World Economic Forum
The article argues that the upsurge in AI has created a need for universities to innovate and evolve. Already, the marking of student papers is becoming a thing of the past, as AI is able to assess work and even “flag up” issues with ethics, and students are less able to distinguish between teacher marking and that of a “bot”. Teaching is also being affected: students can undertake statistics courses taught with AI, massively reducing learning (and human teacher) time, with apparently equal learning and application outcomes. The author argues that universities will need to up their game on employability, and indeed on attractive employment (remuneration). The piece is easy to read and clearly outlines the range of benefits and subsequent issues in relation to AI. All pertinent.

Suzi read three short opinion pieces: What are universities for and how do they work? by Keith Devlin, Everything must be measured: how mimicking business taints universities by Jonathan Wolff, and Universities are broke. So let’s cut the pointless admin and get back to teaching by André Spicer.

Devlin focused largely on the role of research within maths departments. The most interesting part, for me, came at the end when he talked about universities as communities and learning occurring “primarily by way of interpersonal interaction in a community”. Even without thinking about research outputs, there is value then in having a rich and varied community with faculty who have deep love and enthusiasm for their subject.

Wolff provided a clear and compelling dissection of how current educational policy is creating adverse incentives against community-mindedness (both within and between universities) – something detrimental to the education sector, which is such a significant part of the UK economy.

Spicer provides an insight into how this feels as an academic. He notes that “In the UK, two thirds of universities now have more administrators than they do faculty staff” and describes academics as “drowning in shit” (pointless admin).

For me, Spicer’s solutions for what universities could do to change this weren’t so compelling. If I could change one thing I would look at how we cost (or fail to cost) academic staff time. Academics can feel that they are expected to just do any amount of work they are given, or at least they often have no clear divide between work and not-work and have to constantly negotiate their time.

Amy didn’t read, but watched Why mayors should rule the world – Benjamin Barber – and would highly recommend it. Our modern democracy revolves around ancient institutions – we elect leaders most of us never meet and feel we have very little input into the democratic process. This isn’t the case in cities: the leaders of cities, mayors, are seldom from anywhere other than the city they look after. They went to the local schools, they use the public transport and hospitals – they’ve watched their city grow, and they have a vested interest in improving it. Progress on existential issues such as climate change and terrorism is happening in cities (he gives the example of the LA port, which after a clean-up initiative reduced the city’s overall emissions by 20%), and something can be learnt from the way they operate. There are networks of mayors across the world, with a sense of competitiveness between them as to who can run the best city. Mayors from different cities meet up and share their practices, helping other cities implement changes using best practice, without the bureaucracy of central government slowing change down.

Suzanne watched What are universities for?, the RSA talk by Professor Stefan Collini and Professor Paul O’Prey. The second half of the video was more focused on the way that higher tuition fees have changed the nature of the relationship between universities and students, but the introduction to the talk was much more on the topic we were discussing today. Stefan Collini began by saying that he believes universities are partially protected spaces which prioritise ‘deepening human understanding’, and that there are few if any other places where this happens. He compared them to other organisations which do research, such as R&D departments in industry, or teams working in politics, but said the difference was that universities were able to follow second-order enquiries, and look at the boundaries of topics and knowledge, as they didn’t have a primary purpose of furthering one particular thing or ideal. So, although there are many benefits, such as increased GDP, from the kind of enquiry universities do (the ‘deepening of human understanding’ he started out with), that isn’t their aim or goal. He also went on to say that although we tend to see universities as primarily for the benefit of the individual students (furthering their careers, developing their own skills and knowledge), they should be seen as providing public good as well, for the reasons outlined above. In the group we discussed his basic premise, that universities are ‘protected spaces’, and decided that we aren’t sure that is really the case (especially with so much research being funded by grants from industry). However, it did lead to an interesting discussion about what we feel universities are actually for, if they aren’t what Collini outlined.

Suggested reading