Curriculum theories – notes from reading group

Thanks to Sarah Davies for setting us some fascinating reading!

Connected curriculum chapter 1 (notes from Chris Adams)

The connected curriculum is a piece of work by Dilly Fung from UCL. It is an explicit attempt to outline how departments in research-intensive universities can develop excellent teaching by integrating their research into it; the ‘connected’ part of the title is the link between research and teaching. At its heart is the idea that the predominant mode of learning for undergraduates should be active enquiry, but that rather than students discovering for themselves things which are well-established, they should be discovering things at the boundaries of what is known, just like researchers do.

It has six strands:

  • Students connect with researchers and with the institution’s research. Or, in other words, the research work of the department is explicitly built into the curriculum.
  • A throughline of research activity is built into each programme. Properly design the curriculum so that research strands run through it, and it builds stepwise on what has come before.
  • Students make connections across subjects and out to the world. Interdisciplinarity! Real world relevance.
  • Students connect academic learning with workplace learning. Not only should we be teaching them transferable skills for a world of rapid technological change, but we need to tell them that too.
  • Students learn to produce outputs – assessments directed at an audience. Don’t just test them with exams
  • Students connect with each other, across phases and with alumni. This will create a sense of community and belonging.

This last point is then expanded upon. Fung posits that the curriculum is not just a list of what should be learned, but is the whole experience as lived by the student. Viewing the curriculum as a narrow set of learning outcomes does not produce the kind of people that society needs, but is a consequence of the audit culture that pervades higher education nowadays. Not all audit is bad – the days when ‘academic freedom’ gave people tenure and the freedom to teach terribly and not do any research are disappearing, and peer-review is an integral part of the university system – but in order to address complex global challenges we need a values-based curriculum ‘defined as the development of new understandings and practices, through dialogue and human relationships, which make an impact for good in the world.’

I liked it sufficiently to buy the whole book. It addresses a lot of issues that I see in my own department – the separation of research from teaching, and the over-reliance on exams, and the lack of community, for example.

Connected curriculum chapter 2 (notes from Suzi Wells)

As mentioned in chapter 1, the core proposition is that the curriculum should be ‘research-based’ – ie most student learning “should reflect the kinds of active, critical and analytic enquiry undertaken by researchers”.

Fung gives this useful definition of what that means in practice. Students should:

  • Generate new knowledge through data gathering and analysis
  • Disseminate their findings
  • Refine their understanding through feedback on the dissemination

All of it seems fairly uncontroversial in theory and tends to reflect current practice, or at least what we aspire to in current practice. There’s some discussion of the differences in what research means to different disciplines, and how that filters through into assessment of students, and potentially some useful studies on just how effective this all is.

Fung mentions the Boyer Commission (US 1998) and its proposed academic bill of rights, including (for research intensive institutions): “expectation of and opportunity for work with talented senior researchers to help and guide the student’s efforts”. Given increasing student numbers, this is possibly a less realistic expectation to meaningfully meet than it once was.

There’s some useful discussion about what is needed to make research-based-teaching work.

I was particularly interested in the idea that providing opportunity for this form of learning isn’t everything. Socio-economic factors mean that students may have differing beliefs about their own agency. Fung cites Baxter Magolda (2004) on the importance of students having ‘self-authorship’, which includes ‘belief in oneself as possessing the capacity to create new knowledge’ and ‘the ability to play a part within knowledge-building communities’. You can’t assume all students arrive with the same level of this, and it will affect their ability to participate.

This part of the chapter also talks about the importance of not just sending students off “into the unknown to fend for themselves” – imagine a forest of ivory towers – but to give them support & structure. Activities need to be framed within human interactions (including peer support).

Towards the end there is a nod to it being anglo-centric – African and Asian educational philosophy and practice may be different – but little detail is given.

How Emotion Matters in Four Key Relationships in Teaching and Learning in Higher Education (notes from Roger Gardner)

This is a 2016 article by Kathleen Quinlan, who is now Director of the Centre for the Study of Higher Education and Reader in Higher Education at University of Kent, but was working at Oxford when this was written.

She writes that while historically there has been less focus on Bloom’s affective domain than the cognitive, recently interest in the relation of emotions to learning has been growing although it is still under-researched. The article comes out of a review of the existing literature and conversations with teachers at the National University of Singapore in August 2014.

The paper focusses on four relationships: students with the subject matter, teachers, their peers and what she calls “their developing selves”. For each section Quinlan includes a summary of implications for teaching practice, which provide some very useful suggestions, ranging from simple things such as encouraging students to introduce each other when starting activities to help foster peer relationships, to advocating further research and exploration into when it is appropriate and educationally beneficial for teachers to express emotions and when not.

Quinlan says “discussions about intangibles such as emotions and relationships are often sidelined”, but it now seems essential to prioritise this if we are to support student wellbeing, and this paper provides some helpful prompts and suggestions for reflection and developing our practice. If you are short of time I recommend looking at the bullet-pointed “implications for practice” sections.

What is “significant learning”? (notes from Chrysanthi Tseloudi)

In this piece, Dr. Fink talks about the Taxonomy of Significant Learning: a taxonomy of new kinds of learning that go beyond the cognitive learning that Bloom’s taxonomy addresses. The taxonomy of significant learning – where significant learning occurs when there is a lasting change in the learner that is important in their life – is not hierarchical, but relational and interactive. It includes six categories of learning:

Foundational knowledge: the ability to remember and understand specific information as well as ideas and perspectives, providing the basis for other kinds of learning.

Application: learning to engage in a new kind of action (intellectual, physical, social, etc) and develop skills that allow the learner to act on other kinds of learning, making them useful.

Integration: learning to see, understand, and make new connections between different things, people, ideas, realms of ideas or realms of life. This gives learners new (especially intellectual) power.

Human Dimension: learning about the human significance of things they are learning – understanding something about themselves or others, getting a new vision of who they want to become, understanding the social implications of things they have learned or how to better interact with others.

Caring: developing new feelings, interests, values and/or caring more about something than before; caring about something feeds the learner’s energy to learn about it and make it a part of their lives.

Learning how to learn: learning about the learning process; how to learn more efficiently, how to learn about a specific method or in a specific way, which enables the learner to keep on learning in the future with increasing effectiveness.

The author notes that each kind of learning is related to the others and achieving one kind helps achieve the others. The more kinds of learning involved, the more significant is the learning that occurs – with the most significant kind being the one that encompasses all six categories of the taxonomy.

Education Principles: Designing learning and assessment in the digital age (notes from Naomi Beckett)

This short paper is part of a guide written by Jisc. It covers what Education Principles are and why they are such a vital part of any strategy. As someone unspecialised in this area, I found it an interesting read for understanding how principles can bring staff together to engage with and develop different education strategies. The guide talks about how principles can ‘provide a common language, and reference point for evaluating change’.

The paper talks about having a benchmark against which everyone can check their progress. I like this idea. So often projects become too big and the ideas and values the team first agreed on are lost. Having a set of principles is a way to bring everything back together and is a useful way to enable a wide variety of staff to engage with each other. The guide mentions how having these principles means there is a ‘common agreement on what is fundamentally important.’

Having these principles developed at the beginning of a project puts the important ideas and values into motion and is a place to look back to when problems arise. Principles should be action oriented, and not state the obvious. Developing them in this way allows for a range of staff members to bring in different ideas and think about how they want to communicate their own message.

I also followed up by reading ‘Why use assessment and feedback principles?’ from Strathclyde’s Re-Engineering Assessment Practices (REAP) project.

Suggested reading

OU Innovating Pedagogy 2019 – notes from reading group

All read sections of the OU Innovating Pedagogy 2019 report

Learning with Robots (read by Naomi Beckett)

This short piece talked about how robots are now being used for educational purposes, and which ones are being used. Much of the article concerned how robots can enhance learning by learning things themselves: learners teach something to the robot as a way of showing they have accomplished a new skill, and in turn the robot gains new information.

The article also talked about how robots can enable a calmer approach to teaching. Robots won’t raise their voice or show (real) emotions in a session; this calm approach, it is argued, will allow students to learn in a less stressful environment. It also discusses how having a robot as a learning tool may excite or motivate learners, although it only briefly mentions how a robot would conduct a class full of students on its own.

Some aspects of the article made a reasonable case for how robots could aid learning, but these ideas didn’t go into much depth. It discussed how robots could speak several languages and so converse comfortably with a wider range of students. It also talked about how robots could act as mediators for students, able to check in or provide advice at any time of day. They could handle routine tasks and issues, freeing up teachers’ time to spend with their learners.

As mentioned in the article, ‘many people have an inherent distrust of advancing technologies.’ There are several questions to ask about how far a robot should be integrated into a learning environment, and when it becomes too much. But the article makes a number of interesting points about how robots are taking small steps to aid and enhance learning.

Reading this section got me thinking about the AV1 robot, created by No Isolation to ‘reduce loneliness and social isolation through warm technology’. AV1 was designed for children who are too ill to go to school: the robot sits in the class and the child at home connects through it. Using an app, the children can take part in the classroom – they can raise their hand to answer questions, talk to nearby students, ask questions, and just listen if they want to. A great use of technology to keep students engaged with their learning and classmates.

Decolonising learning (read by Sarah Davies)

This section was not about decolonising the curriculum – itself an important area for Bristol – but rather reflecting on how digital environments, tools and activities can be used in ways which invert power relationships and the cultural and educational capital of the dominant culture, and support colonised or marginalised populations in education, sense-making and cultural development which is meaningful to them. It notes that decolonisation requires systematic unsettling change.

The article reminds us that we need to acknowledge the ways in which digital presence can contribute to colonisation – so digital environments created by a dominant culture may not create spaces for the kind of discussions, activities and issues which are meaningful to those of other cultures. It suggests that MOOCs can often be a form of digital colonisation – people from all over the world learn from massive courses produced in just a few countries.

In contrast, digital decolonisation considers how to support colonised, under-represented, uprooted or otherwise marginalised people with technology in order to:

  • connect them with a shared history,
  • support a critical perspective on their present,
  • provide tools for them to shape their futures.

But how to use the technology must be decided by the people themselves.

Critical pedagogies – in which students are expressly encouraged to question and challenge power structures, authority and the status quo – provide frameworks for the academic success of diverse students – eg by seeking to provide a way of maintaining their cultural integrity while achieving academic success, or to sustain the cultural competence of their community while gaining access to the dominant cultural competence.

Digital storytelling is an example of a pedagogical tool that can be used for decolonising purposes – empowering students to tell their own stories, turning a critical lens on settler colonialism, capturing stories of indigenous or marginalised people taking action on issues, critiques of colonial nations.

Two final messages from this article which resonated for me were that success in or after HE for some groups of students may be at odds with notions of success in the dominant society (as captured in things like Graduate Outcomes); and that education needs to be reimagined as an activity that serves the needs of local communities – though what that means for Bristol and the local, national and international communities it exists within, I’m not sure.

Virtual studios (read by Suzi Wells)

I found this a useful exercise in thinking about what a studio is and what it is for – and how much of that might be reimagined online. Studios are described in the report as collaborative, creative, social, communal spaces. They contain creative artefacts (sketches, models, objects). Learning in studios is by doing and is often peer-supported with tutors facilitating and guiding rather than instructing.

The report describes virtual studios as being focused on digital artefacts. “Virtual studios are all about online exchange of ideas, rapid feedback from tutors and peers, checks on progress against learning outcomes, and collaboration”

The first benefit of virtual studios given is scale: a studio can be for 100s of learners. This left me wondering if this is in conflict with the idea of studios as a community.

Virtual studios are also described as “hubs”, an idea I would have liked to explore further. I wanted to know how a hub is different from a community. What are we trying to achieve when we make something hub-like? I suppose a hub is a place which provides a starting point or a loose join between disparate activities or organisations. It’s not just a community, but has other communities floating around it.

Virtual studios can be a way to give more people (fully open, even) access to experts and facilities. The example given was the (oft-cited, so fairly unique?) ds106 Digital Storytelling course.

Areas to explore further:

  • Could e-portfolios benefit from being grounded in something more virtual-studio-like (how much are they doing that already)?
  • How big can a virtual studio be before it loses the community feeling? Is there a way to scale community?

Place Based Learning (read by Michael Marcinkowski)

While the article on place based learning only provided a surface view of the approach, I found it very interesting in two distinct ways.

First, it focused on place based learning as not being solely the province of lessons conducted in the field, away from the classroom. What was highlighted in the article was the way that place based learning could just as easily take place in the classroom with students studying their local communities or local histories from their desks. Whether in the classroom or the field, the focus is on how students are able to make robust connections between their personal situation and their learning.

This kind of connection between the learner and their local community provides the foundation for the second point of interest in the article: that place-based learning can easily incorporate aspects of critical pedagogy. As students explore their local communities, they can both explore critical issues facing the community and build on their own experiences in order to support their learning. One example that was noted was having students explore the function of public transportation networks in their community, looking at questions of availability, accommodation, and planning.

An important development in place based learning has been the rise in the ubiquity of smartphones and other location-aware devices. By tapping into GPS and other forms of location networks, it becomes possible to develop applications that allow learners to dynamically access information about their surroundings. The article mentions one project that allows language learners to access vocabulary specific to the locations in which it is used, for instance, having transit based vocabulary guides triggered near bus stops. The idea is that such systems allow for the in-situ acquisition of vocabulary in a way which is both useful in the moment and that reinforces learning.

There are already a number of good examples of place based learning that have been developed out of the University of Bristol, including the Bristol Futures Course which encourages students to explore and engage with the wider city of Bristol and the Romantic Bristol smartphone app which highlights places of historic and literary importance around the city.

Particularly as the University begins to confront its legacy of involvement with the slave trade, there look to be a number of ways in which place based education can continue to be fostered among the University community.

Roots of Empathy (read by Chrysanthi Tseloudi)

This section describes a classroom programme that aims to teach children empathy, so they can have healthy and constructive social interactions.

In this programme, children between 5-13 years old get visits in their school class every 3 weeks from a local baby, their parent and a Roots of Empathy instructor. The children observe how the baby and its feelings develop and its interactions with the parent. With the guidance of the instructor, the children learn about infant development and identify the baby’s feelings, their own and those of others; they then reflect on them, describe and explain them. There are opportunities for discussion and other activities, including the children recording songs for their baby and reflecting on what they would like the baby’s future to be like. The curriculum is broken down into themes, which are then broken down further into age ranges. While the activities focus on feelings, some use knowledge and skills from school subjects, e.g. mathematics. Research on the programme has shown positive results in decreasing aggression and increasing positive social behaviours.

It was interesting to read about this approach. Something that stood out for me was that while the learners identifying their own feelings is mentioned, it is not obvious whether this is an explicit aim of the programme. That made me wonder whether it is assumed that a person who can identify others’ feelings can definitely identify their own (in which case the programme addresses this skill implicitly), whether it is assumed that the children can already do this, or whether knowing one’s own feelings is not considered an important skill in healthy social interactions. I also wondered how children who have significant difficulties identifying their own or others’ feelings fare in this programme, and if/how they are further supported.

JISC Horizon Report on wellbeing and mental health – notes from reading group

Suzi read the first section of the JISC Horizon Report mental health and wellbeing section. This talked about the increasing demands on mental health services and discussed some possible causes including worries about money and future prospects, diet, use of social media, and reduced stigma around talking about mental health.

Many institutions are increasing their efforts around student wellbeing. The report mentioned a new task force looking at the transition to university and support in first year: Education Transitions Network.

Four technologies are mentioned as currently being used within HE:

  • Learning analytics to identify students who may need checking in on
  • Apps and online mood diaries, online counselling
  • Peer support (overseen by counsellors) via Big White Wall
  • Chatbots

The report didn’t have a great amount of detail on how these are used. Using learning analytics to see who’s not engaging with online content seems like the simplest application, and is doable in many systems, but even this would require care. Revealing that you are keeping students under surveillance in a way they might not expect may cause them to lose trust in the institution and retreat further (or game the system to avoid interventions). Then again, maybe it’s just helping us know the sort of things a student might expect us to know. Universities can be quite disjointed – in a way that may not seem natural or helpful to students. Analytics could provide much-needed synaptic connections.
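To make the ‘simplest application’ concrete, here is a minimal sketch of flagging low engagement from weekly activity counts. All names and thresholds are invented for illustration; they are not taken from the report or any real system.

```python
# Hypothetical sketch: flagging students whose online activity is low,
# as one simple learning-analytics application. The threshold and data
# shape are invented for illustration only.

def flag_disengaged(activity_log, min_events_per_week=3):
    """Return ids of students whose average weekly event count falls
    below a threshold. activity_log maps student id -> list of weekly
    event counts from a virtual learning environment."""
    flagged = []
    for student, weekly_counts in activity_log.items():
        avg = sum(weekly_counts) / len(weekly_counts)
        if avg < min_events_per_week:
            flagged.append(student)
    return flagged

log = {
    "s1": [10, 8, 9],   # steady engagement
    "s2": [5, 1, 0],    # tailing off
    "s3": [0, 0, 1],    # barely engaging
}
print(flag_disengaged(log))  # ['s2', 's3']
```

Even this toy version shows where the care is needed: the threshold is arbitrary, and a low count may reflect studying offline rather than disengagement.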

It also struck me that using technology to support wellbeing (and even mental health) is in some ways similar to teaching: what you’re trying to achieve is not simple to define and open to debate.

Johannes read the blog post Learning Analytics as a tool for supporting student wellbeing and watched a presentation by Samantha Ahern. Samantha Ahern is a Data Scientist at UCL and does research concerning the implications of learning analytics for student wellbeing.

In her presentation, she outlined the current problem the HE sector has with student wellbeing and provided some alarming numbers about the increase in reported mental disorders among young adults (16–24 years old). According to the NHS survey on mental health in the UK, around 8% of male and 9% of female participants had diagnosed mental health issues in 1992. These numbers increased to more than 19% of males and 26% of females in 2014. Interestingly, females are much more likely to report mental health issues than males, who, unfortunately, are the ones doing most harm to themselves.

In her opinion, HE institutions have a great responsibility to act when it comes to tackling mental health problems. However, not all activities actually support students. She argues that too many university policies put the onus to act on the student, but the ones who need help the most often do not report their problems. Universities should therefore take a much more active role, and some rethinking needs to take place.

Her main argument is that although learning analytics is still in its beginnings and might sound like a scary and complicated topic, it is worth researching, as it has the capability to really improve student wellbeing when done correctly.

It was very interesting to read and listen to her arguments, although it was meant as an introduction to learning analytics and did not provide any solutions to the issues.

Roger read “AI in Education – Automatic Essay Scoring”, referenced on page 27 of the JISC Horizon report. Is AI ready to give “professors a break”, as suggested in a 2013 article from the New York Times referring to work by edX on software which automatically assigns a grade (not feedback) to essays? If so, then surely this would improve staff wellbeing?

Fortunately for the Mail Online, who responded to the same edX news in outraged fashion (“College students pulling all-nighters to carefully craft their essays may soon be denied the dignity of having a human being actually grade their work”) it doesn’t seem that this is likely any time soon.

Recent work from two Stanford researchers built on previous results from a competition to develop an automatic essay scoring tool, increasing the alignment of the software with human scorers from 81% in the previous competition to 94.5%. This immediately raised the question for me: how consistent are human scorers? The article did at least acknowledge this, saying “assessment variation between human graders is not something that has been deeply scientifically explored and is more than likely to differ greatly between individuals.”
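Agreement figures like these are easier to reason about with a concrete measure. As an illustration (the grades below are invented), raw percent agreement can be compared with Cohen’s kappa, a standard statistic that corrects for the agreement two graders would reach by chance:

```python
# Percent agreement vs Cohen's kappa for two graders marking the same
# essays. The grade lists are invented illustration data.
from collections import Counter

def percent_agreement(a, b):
    """Fraction of items on which the two graders gave the same grade."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance, from each grader's marginals."""
    po = percent_agreement(a, b)          # observed agreement
    ca, cb = Counter(a), Counter(b)
    n = len(a)
    # expected chance agreement if graders assigned grades independently
    pe = sum(ca[g] * cb[g] for g in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

grader1 = ["A", "B", "B", "C", "A", "B"]
grader2 = ["A", "B", "C", "C", "A", "A"]
print(percent_agreement(grader1, grader2))  # 4 of 6 essays match
print(cohens_kappa(grader1, grader2))       # noticeably lower once chance is removed
```

The gap between the two numbers is the point: a headline agreement percentage, whether human–human or human–machine, looks less impressive once chance agreement is accounted for.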

Apparently the edX system is improving as more schools and universities get involved, giving it more data to work with, but their website states it is not currently available as a service. The article acknowledges the scepticism in some quarters, in particular the work of Les Perelman, and concludes that there is still “a long way to go”.

Chrysanthi read Learning analytics: help or hindrance in the quest for better student mental wellbeing?, which discusses the data learners may want to see about themselves and what should happen if the data suggests they are falling behind.

Learning analytics can detect signs that may indicate that a student is facing mental health issues and/ or may drop out. When using learning analytics to detect these signs, the following issues should be considered:

  • Gather students’ data ethically and focus on the appropriate metrics to see if a student is falling behind and what factors may be contributing to this.
  • Give students a choice about the data they want to see about themselves and its format, especially when comparisons with their cohort are involved.
  • Support students at risk, bearing in mind they may prefer to be supported by other students or at least members of staff they know.
  • Talk to students about how to better use their data and how to best support them.

Chrysanthi also read the “What does the future hold” section in JISC Horizon Report Mental Health and Wellbeing, which attempts to predict how wellbeing may be handled in the next few years:

  • Within 2 years, students will have a better understanding of their mental health, more agency, increased expectations for university support and will be more likely to disclose their mental health conditions, as they become destigmatised. Institutions will support them by easing transitions to university and providing flexible, bite-sized courses that students can take breaks from. The importance of staff mental health will also be recognised. New apps will attempt to offer mental wellbeing support.
  • In 3-5 years, institutions will manage and facilitate students supporting each other. Students’ and staff wellbeing will be considered in policy and system design, while analytics will be used to warn about circumstances changing. We may see companion robots supporting students’ needs.
  • In 5 years, analytics may include data from the beginning of students’ learning journey all the way to university to better predict risks.

The Horizon group then gives suggestions to help with the wellbeing challenge, including providing guidance, offering education on learning, personal and life skills, and regularly consulting the student voice. Next steps include exploring the possibility of a wellbeing data trust to enable organisations to share sensitive student data with the aim of helping students; a wellbeing bundle of resources, apps, etc.; and more work on analytics, their use to help students and staff, and the ethical issues involved.

Naomi read ‘Do Online Mental Health Services Improve Help-Seeking for Young People? A Systematic Review’.

This article from 2014 talks about young people using online services to look for help and information surrounding mental health. The review investigates the effectiveness of online services but does state that a lot more research needs to be done in this area. The article moves between the ideas of seeking help and of self-help, and talks about the benefits of both. It mentions how young people now feel they should problem-solve for themselves, so providing an online space for them to access useful information is a great way for them to seek help.

The review mentions how ‘only 35% of young people experiencing mental health problems seek professional face to face help’. This statistic underlines the need for online services to provide help and assistance to those in need. It does add that young people have improved mental health literacy and are better at recognising that they, or someone they know, may need help. With face-to-face professional help becoming increasingly hard to access, more are turning to online information. It has to be said, however, that online help has no follow-up: young people can be given information online with no way to continue gaining assistance.

One interesting part of the article talked about structured and unstructured online treatment programmes. Although effective at reducing depression and anxiety, structured programmes had poor uptake and high drop-out rates, with no way for help to be maintained. Unstructured programmes are more useful in the sense that the user can select links that appear useful and disregard information that seems irrelevant.

This article wasn’t student-focused and only covered data collected from younger people, but the ideas behind the review are pertinent in a higher education context.

Suggested reading

Jisc Horizon Report mental health and wellbeing section

Or investigate / try out one or more of the online services listed here (or any other – these just seem like helpful lists):

Or related articles

Near future teaching – notes from reading group

For our latest reading group, following Sian Bayne’s fascinating Near Future Teaching seminar for BILT, we wanted to look in more depth at the project materials and related reading.

Michael read ‘Using learning analytics to scale the provision of personalized feedback,’ a paper by Abelardo Pardo, Jelena Jovanovic, Shane Dawson, Dragan Gasevic and Negin Mirriahi. Responding to the need to provide individual feedback to large classes, the study presented and tested a novel system that uses learning analytics data generated by student activity within a learning management system to deliver what the authors call ‘personalized’ feedback. As designed, the system allowed instructors to create small, one- or two-sentence pieces of feedback for each activity within a course. Based on these, each week students would receive a set of ‘personalized’ feedback responding to their level of participation. The authors found an improvement in student satisfaction with the feedback received, but only a marginal improvement in performance compared to previous years. There were limits to the methodology — the study used at most three years of student data for comparison — and the authors’ definition of ‘personalized feedback’ seemed in practice to be little more than customized boilerplate, but the study nevertheless made a few interesting points. First, it was admirable in the way it sought to use learning analytics techniques to improve feedback in large courses. Second, the authors took the well-considered step of focusing the feedback not on the content of the course but on students’ study habits: the feedback might encourage students to make sure they did all the reading that week if they weren’t doing well, or to review the material again if they had already been through it once. Third, the article offered an interesting account of the history of the concept of feedback, as it moved from addressing only the gap between targets and actual performance to a more holistic and continuous relationship between mentor and student.
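
The mechanism described above, instructor-written snippets assembled per student from logged participation, might be sketched like this. It is a minimal illustration and not the authors' implementation; all activity names, thresholds and messages are invented:

```python
# Sketch of a rule-based "personalized" feedback assembler: the instructor
# attaches short feedback snippets to each activity, and a student's weekly
# feedback is assembled from the snippets matching their logged participation.
# Activity names, thresholds and messages are illustrative, not from the paper.

def weekly_feedback(activity_counts, rules):
    """Pick one message per activity based on the student's event counts.

    activity_counts: {activity: number of logged interactions this week}
    rules: {activity: [(min_count, message), ...]} with tiers ordered
           from highest min_count to lowest.
    """
    messages = []
    for activity, tiers in rules.items():
        count = activity_counts.get(activity, 0)
        for min_count, message in tiers:
            if count >= min_count:
                messages.append(message)
                break
    return messages

rules = {
    "reading": [
        (3, "You have worked through the reading - try the extension questions."),
        (0, "You have not done this week's reading yet - do catch up before the seminar."),
    ],
}

# A student with one logged reading event gets the lower-tier study-habits nudge.
print(weekly_feedback({"reading": 1}, rules))
```

Note that, as in the paper, the messages address study habits rather than course content, which keeps the snippets reusable across the whole cohort.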

Suzi read Higher education, unbundling, and the end of the university as we know it by Tristan McCowan. This paper starts with a thorough guide to the language of unbundling and the kinds of things that we talk about when we talk about unbundling, followed by an extensive discussion of what this means for higher education. My impression from the article was that “unbundling” may be slightly unhelpful terminology, partly because it covers a very wide range of things, and partly because – if the article is to be believed – it’s a fairly neutral term for activities which seem to include asset-stripping and declawing universities. As an exploration of the (possible) changing face of universities it’s well worth a read. You can decide for yourself whether students are better off buying an album than creating their own educational mixtape.

Roger read “Future practices”. For world 1, human-led and closed, I was concerned that much was available only to “higher paying students” and there was no mention at all of collaborative learning. For world 2, human-led and open, I liked the idea of the new field of “compassion analytics”, which would be good to explore further, along with lots of challenge-based learning and open content. World 3, tech-led and closed, was appealing in its emphasis on wellbeing in relation to technology, and a move away from traditional assessment, with failure recognised more as an opportunity to learn, and reflection and the ability to analyse and synthesise prioritised. From world 4, tech-led and open, I liked the emphasis on lifelong learning and individual flexibility for students, e.g. to choose their own blocks of learning.

Chrysanthi read Future Teaching trends: Science and Technology. The review analyzes 5 trends:

  • datafication – e.g. monitoring students’ attendance, location, engagement, real-time attention levels,
  • artificial intelligence – e.g. AI tutoring, giving feedback, summarizing discussions and scanning for misconceptions, identifying human emotions and generating its own responses rather than relying only on past experience and data,
  • neuroscience and cognitive enhancement – e.g. brain-computer interfaces, enhancement tools like tech that sends currents to the brain to help with reading and memory or drugs that improve creativity and motivation,
  • virtual and augmented realities – e.g. that help to acquire medical skills for high-risk scenarios without real risk, or explore life as someone else to develop empathy, and
  • new forms of value – enabling e.g. the recording and verification of all educational achievements and accumulation of credit over one’s lifetime, or the creation of direct contracts between student and academic.

I liked it because it gave both pros and cons in a concise way. It allows you to understand why these trends would be useful and could be adopted widely, at the same time as you are getting a glimpse of the dystopian learning environment they could create if used before ethical and other implications have been considered.

Suggested reading

Feedback, NSS & TEF – notes from reading group

Chrysanthi read “Thanks, but no-thanks for the feedback”. The paper examines how students’ implicit beliefs about the malleability of their intelligence and abilities influence how they respond to, integrate and deliberately act on the feedback they receive. It does so based on a set of questionnaires completed by 151 students (113 female, 38 male), mainly from the social sciences.

Mindset: There are two kinds of mindset regarding the malleability of one’s personal characteristics. People with a growth mindset believe their abilities can grow through learning and experience; people with a fixed mindset believe they have a fixed amount of intelligence which cannot be significantly developed. “If intelligence is perceived as unchangeable, the meaning of failure is transformed from an action (I failed) to an identity (I am a failure)” (p851).

Attitudes towards feedback: Several factors that influence whether a person accepts a piece of feedback – e.g. how reflective it is of their knowledge and whether it is positive or negative – were measured, as well as two outcome measures.

Defence mechanisms: Defence mechanisms are useful in situations we perceive as threatening, as they help us control our anxiety and protect ourselves. But if we are very defensive, we are less able to perceive the information we receive accurately, which can be counterproductive; e.g. a student may focus on who has done worse, to restore their self-esteem, rather than who has done better, which can be a learning opportunity.

The results of the questionnaires measuring the above showed that more students had a fixed mindset (86) than growth (65) and that their mindset indeed affected how they responded to and acted on feedback.

  • Growth-mindset students are more likely to challenge themselves and to see the feedback giver as someone who can push them out of their comfort zone in a good way that will help them learn. They are more motivated to change their behaviour in response to feedback, engage in developmental activities and use the defence mechanisms considered helpful.
  • Fixed-mindset students are also motivated to learn, but are more likely to go about it in an unhelpful way. They make choices that protect their self-esteem rather than help them learn, they are not as good at using the helpful defence mechanisms, and they distort the facts of the feedback or think of an experience as all good or all bad. The authors seemed puzzled by the indication that fixed-mindset students are motivated to engage with feedback, yet do so by reshaping reality or dissociating themselves from the thoughts and feelings surrounding it.

Their recommendations?

  • Academics should be careful about how they deliver highly emotive feedback, even when they don’t have the time to make it thorough and individualised.
  • Lectures and seminars early in students’ studies, teaching them about feedback’s goal and related theory and practice, as well as action-orientated interventions (e.g. coaching), so they learn to recognise any self-sabotaging behaviours and manage them intelligently.
  • Strategies to help students become more willing to experience – and stay with – the emotional experience of failure. E.g. enhance the curriculum with opportunities for students to take risks, so they become comfortable with both “possibility” and “failure”.

I think trying to change students’ beliefs about the malleability of their intelligence would go a long way. If one believes their abilities are fixed and therefore if they don’t do well, they are a failure, a negative response to feedback is hardly surprising. That said, the responsibility of managing feedback should not fall entirely on the student; it still needs to be constructive, helpful and given in an appropriate manner.

Suzi read: An outsider’s view of subject-level TEF, A beginner’s guide to the Teaching Excellence Framework, and Policy Watch: Subject TEF year 2, by the end of which she was not convinced anyone knows what the TEF is or how it will work.

Some useful quotes about TEF 1

Each institution is presented with six metrics, two in each of three categories: Teaching Quality, Learning Environment, and Student Outcomes and Learning Gain. For each of these measures, they are deemed to be performing well, or less well, against a benchmarked expectation for their student intake.

… and …

Right now, the metrics in TEF are in three categories. Student satisfaction looks at how positive students are with their course, as measured by teaching quality and assessment and feedback responses to the NSS. Continuation includes the proportion of students that continue their studies from year to year, as measured by data collected by the Higher Education Statistics Agency (HESA). And employment outcomes measures what students do (and then earn) after they graduate, as measured by responses to the Destination of Leavers from Higher Education survey – which will soon morph into Graduate Outcomes.

Points of interest re TEF 2

  • Teaching intensity (contact hours) won’t be in the next TEF
  • All subjects will be assessed (at all institutions), with results available in 2021
  • Insufficient data for a subject at an institution could lead to “no award” (so you won’t fail for being too small to measure)
  • Resources will be assessed
  • More focus on longitudinal educational outcomes, not (binary) employment on graduation
  • It takes into account the incoming qualifications of the students (so it does something like the “value add” thing that school rankings do) but some people have expressed concern that it will disincentivise admitting candidates from non-traditional backgrounds.
  • There will be a statutory review of the TEF during 2019 (reporting at the end of the year) which could change anything (including the gold / silver / bronze rankings)

Suzi also read Don’t students deserve a TEF of their own?, which talks about giving students a way in to play with the data so that, for example, if you’re more interested in graduate career destinations than in assessment and feedback you can pick on that basis (rather than on the aggregated data). It’s an interesting idea and may well happen, but as a prospective student I can’t say I understood myself – or the experience of being at university – well enough for that to be useful. There’s also a good response talking about the kinds of things (the library is badly designed, lectures are at hours that don’t make sense because rooms are at a premium, no real module choice) you might find out too late about a university that would not be covered by statistics.

Roger read “How to do well in the National Student Survey (NSS)”, an article from Wonkhe, written in March 2018. The author, Adrian Burgess, Professor of Psychology at Aston University, offers some reflections based on an analysis of NSS results from 2007 to 2016.

Whilst many universities have placed great emphasis on improving assessment and feedback, this has “brought relatively modest rewards in terms of student satisfaction” and remains the area with the lowest satisfaction.

Burgess’ analysis found that the strongest predictors of overall satisfaction were “organisation and management” closely followed by “teaching quality”.

Amy read Feedback is a two-way street. So why does the NSS only look one way?, an article by Naomi Winstone and Edd Pitt. This piece highlighted the issue that the NSS questions on feedback are framed as if feedback should be a passive experience – that students should be given their feedback. In 2017, the question was changed from “I have received detailed comments” to “I have received useful comments”. Both the old and new questions frame feedback as something that is received – a ‘transmission-focussed mindset’ – whereas Winstone and Pitt argue that feedback should be a two-way relationship, with the student working with the feedback and their tutor to develop.
The authors do not believe that changing the NSS question will solve all of the problems with students’ perception of feedback (though it will certainly help!), but they do believe that by promoting feedback as something that individuals work with, take responsibility for and seek out when they feel they need to develop in a certain area, the mindset will gradually change and feedback will become a more sustainable form of learning for students.

Suggested reading

From WonkHE

From the last time we did assessment & feedback, which was July 2017 (I’ve left in who read what then)

Accessibility, inclusivity, universal design – notes from the reading group

Naomi looked at the Accessibility of e-learning OU Course, and read the 10 key points from the UCL Blog

The summary comments written by Jessica Gramp summed up the OU course and gave a good overview of what it covered, as well as an idea of how wide the disability scope is. It was an interesting read for someone whose knowledge of accessibility in e-learning is quite limited.

The post explains that there are two views of disability. The Medical Model describes ‘the problem of disability as stemming from the person’s physical or mental limitation’, while the Social Model ‘sees disability as society restricting those with impairments in the form of prejudice, inaccessible design, or policies of exclusion.’

The idea of society restricting those with impairments through inaccessible design was interesting, as it is something most people have done but often give little thought to. We often like to design things to look ‘pretty’ but give little thought to those using screen readers, or to how we would describe an image, for example. The post also mentions that accessibility is about both technical and usable access for people with disabilities. Jessica gives the example of a table of data: although it may be technically accessible to someone who is blind, the meaning of the data would be lost through a screen reader, making it unusable. The post and course both talk about evaluating accessibility, but for me it’s something that needs to come right at the beginning of the design. There is no point designing something that uses spreadsheets, for example, if screen readers won’t convey the correct data and meaning to the users.
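
Jessica’s table example can be made concrete with a small sketch (mine, not from the post). The difference between a technically accessible table and a usable one often comes down to markup that gives a screen reader context: a caption announcing what the table is about, and column headers marked with a scope so each cell can be read in relation to its header.

```python
# Minimal illustration of accessible data-table markup: a <caption> and
# scoped <th> headers let a screen reader announce each cell with its
# column context, instead of reading a bare stream of values.
# The table contents here are invented for the example.

def accessible_table(caption, headers, rows):
    """Render an HTML table with a caption and scoped column headers."""
    head = "".join(f'<th scope="col">{h}</th>' for h in headers)
    body = "".join(
        "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
        for row in rows
    )
    return (
        f"<table><caption>{caption}</caption>"
        f"<thead><tr>{head}</tr></thead>"
        f"<tbody>{body}</tbody></table>"
    )

html = accessible_table(
    "Assessment results by week",
    ["Week", "Mark"],
    [["1", "62"], ["2", "71"]],
)
print(html)
```

The same principle applies whether the table is hand-written or generated from a spreadsheet export: without the caption and header scope, the values may be technically readable but the meaning is lost.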

The last point Jessica makes, which I really liked, was that accessible learning environments help everyone, not just those with disabilities.

“This last point reflects my own preference for listening to academic papers while running or walking to work, when I would be otherwise unable to “read” the paper. As a student and full-time employee, being able to use this time to study enables me to manage my time effectively and merge my fitness routine, with study time. This is only possible because my lecturers, and many journals these days too, provide accessible documents that can be read out loud using my mobile smartphone.” – Jessica Gramp

A thought-provoking blog post that gave me a lot to think about and made me put more thought into the work I create online.

Whilst reading this I also came across an article on Twitter from Durham’s student paper The Palatinate. It talks about how Durham University has introduced lecture capture. However, the English department has opted out, citing changes to teaching relationships and a ‘lack of credible evidence that lecture capture improves academic attainment.’ In the department’s email, they talk about the ‘danger of falling attendance, and the potential compromise of the classroom as a safe place, where controversial material can be discussed.’

These are all good points, but the writer of the article points out that accessibility needs may be more important than these factors. With such a wide range of disabilities, lecture capture could provide help in lectures to those who need it. It also raises the question: if they aren’t going to use lecture capture, what are they doing to help their students with disabilities?

It was an interesting article that makes us think about how much weight accessibility carries within teaching and learning. It should be at the front of our minds when we first start designing how we are going to teach or present data. But there is often a stigma, and it can also cause tensions and challenges. Going forward, these need to be addressed rather than ignored.

Suzi read Applying Universal Design for Learning (UDL) Principles to VLE design from the UCL blog. A short, but very thorough and clear post, written as part of UCL’s Accessible Moodle project. For the main part this is, reassuringly enough, a re-framing of things we know make for good accessible web design (resizing text, designing for screen readers, etc). However, it did include the following:

“The VLE should also offer the ability to customise the interface, in terms of re-ordering frequently accessed items, placement of menus and temporarily hiding extraneous information that may distract from the task at hand.”

Not suggestions I have seen before in an accessibility context, possibly because they are more difficult to implement. In particular, the idea of limiting distracting information – that being an accessibility issue – seems obvious once it’s been said. It’s something that would be welcome for a wide range of our students and staff.

Suzi also read Advice for making events and presentations accessible from GOV.UK. Again this is very clear, straightforward advice, well worth being aware of. The advice is for face-to-face events but covers points on supporting a partially remote audience. Some of the points I had not thought of included:

  • Ask your participants an open question about their requirements well before the event. Their wording is “Is there anything we can do to enable you to be able to fully participate in this event?”
  • Don’t use white slide backgrounds because of the glare. For example, GOV.UK slide decks use black text on grey or white text on dark blue.
  • Give audio or text descriptions of any video in your presentation.
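
The advice about slide backgrounds and glare rests on a measurable quantity: the contrast ratio between text and background colours. As a sketch, the WCAG 2.x formula computes the relative luminance of each colour and takes the ratio of the lighter to the darker; the colour values below are illustrative.

```python
# WCAG 2.x contrast-ratio calculation: linearise each sRGB channel,
# combine into a relative luminance, then take the ratio of the lighter
# luminance to the darker one. Ratios run from 1:1 (identical colours)
# to 21:1 (black on white).

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour given as 0-255 channels."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, always >= 1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

The GOV.UK advice is about glare rather than minimum contrast, but the same calculation underlies the familiar AA/AAA thresholds (4.5:1 and 7:1 for body text), so it is a useful check when choosing slide colours.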

There are also some interesting suggestions in the comments. I found the comments particularly interesting as they seem to be individuals speaking directly about their own needs (or possibly those of people they work with) and what they would find most useful. Suggestions include ensuring there is good 3G or 4G coverage, as wifi might not be enough to support assistive technologies, and opening with a roll call (because as a blind person you can’t just glance around the room to see who is there). One commenter suggests you should always sing the key points from your presentation (to an existing tune, no need to compose especially) – an idea I love but not one I’m up to implementing.

Chrysanthi watched two videos from the list 15 inspiring inclusive design talks:

When we design for disability, we all benefit | Elise Roy

In this talk, Elise Roy gives examples of inventions that were initially inspired by or designed for people with disabilities, but turned out to be useful for people without disabilities as well. These include:

  1. Safety glasses that visually alert the user to changes in pitch coming from a tool (which can mean the tool is about to kick back) before the human ear can pick them up (Roy’s own invention).
  2. A potato peeler that was designed for people with arthritis but was so comfortable that others used it.
  3. Text messaging, which was originally conceived for deaf people.

Her suggestion is to design for people with disabilities first, rather than for the norm. The solution may then be not only inclusive but potentially better than if it had been designed for the norm. So rather than “accommodating” people with disabilities, use that energy to come up with innovative solutions that benefit everyone.

Derek Featherstone: Accessibility is a Design Tool

Derek Featherstone makes a similar point to Elise Roy: designing for accessibility can help everyone. Looking at how outliers – people at the ends of a spectrum – will be affected by a design decision can also help us understand how the average person will be affected. “If we look at the extremes, everybody else is going to be somewhere in the middle”. Between no vision and perfect vision, between no hearing and perfect hearing, and so on.

The main points to consider for accessibility as a design tool:

  1. People with disabilities may need specific types of content, on top of the content everyone else gets, in order to make decisions. E.g. to choose a health provider, they don’t just need to know how far away the provider is, but perhaps where the wheelchair ramp is at the practice, as that might affect whether they choose this one or a different one. Designers should find out what kind of extra content they need. Another example: are there captions for this film I am considering watching?
  2. When trying to make something accessible, it is important to consider why it is included in the first place, rather than just what it is. That could be the difference between providing a confusing textual description of an element, and a clear one of how the information the element portrays affects the people accessing it. E.g. instead of trying to textually describe a change of boundaries on a map, give someone the ability to look up their post code and see if they are affected by that change.
  3. Proximity: this well-known design principle of grouping related items together (e.g. images with their textual explanations, instructions with the parts they refer to, etc.) is even more important for people with certain types of disability, such as low vision. This is because it is much easier for them to lose the context, as they see much less of the interface at a time. Derek suggests getting a feel for this by examining an interface through your fist, as if looking through a straw. Actions, buttons, etc. should be placed where the person would expect according to the patterns of use that have been established already: if, so far, an action has been on a specific part of the screen, changing that will be confusing. Buttons should also be distinguishable from each other even without reading; e.g. giving ‘previous’ and ‘next’ buttons exactly the same colours, font and size means the user has to read them to tell them apart.

Finally, it is important to not get so caught up in the technical requirements of making something accessible on paper, that we forget what it is we are trying to achieve.

Suzanne read New regulations for online learning accessibility (WonkHE, 23 Sept 2018)

Published in Wonkhe in September 2018, this article by Robert McLaren outlines the new regulations for online learning accessibility. McLaren works for the think tank Policy Connect, which published a report in collaboration with Blackboard Ally after the government ratified the EU Web Accessibility Directive on the 23rd of September. The directive clarifies the position of HE institutions as public sector bodies and thus includes them in the requirements for web accessibility. This means that VLEs, online documents, video recordings and so on all count as web content, and need to meet the four principles of accessible web design: that it is perceivable, operable, understandable and robust. Additionally, VLEs will have to include an accessibility statement outlining the accessibility compliance of the content, directing students to tools that help them get the most from the content (such as browser plugins), and explaining how students can flag any inaccessible content. As McLaren notes, this has long been considered good practice and isn’t really anything new, but it is now a legal duty.

The article then outlines several areas which may still need addressing in VLE content. The first is ensuring content is usable. The example he uses is the prevalence of scanned PDFs, which are hard or impossible to work with (as they appear as images rather than text) not only for disabled students but also for non-disabled students and those working on mobile devices. From this point, McLaren moves on to discuss briefly the idea of universal design, which he defines as “educational practice that removes the barriers faced by disabled students and thereby benefits all students.” He claims that the rise of universal design has been fuelled in part by cuts to Disabled Students’ Allowances and an increasing shift in focus towards universities removing barriers for disabled students, rather than DSA and other measures which mitigate those barriers once they are in place.

The article then suggests a model for ensuring the change required to meet these needs: “We recommended a cascading approach. Government should work with sector organisations to provide training for key staff such as learning technologists, who can in turn train and produce guidance for teaching staff.” As the report was sponsored by Blackboard Ally, it is perhaps not surprising that another side of their solution is to provide a range of usable and flexible resources, which Ally helps users ensure they are providing. The final remarks, however, surely stand true no matter how achieved (through Ally or other means): “An inclusive approach allows all students to learn in the ways that suit them best. If the sector can respond effectively to these regulations, all students, disabled and non-disabled, will benefit from a better learning experience.”

Suggested reading

Online communities – notes from the reading group

Amy read Professors share ideas for building community in online courses. The over-arching narrative of this piece was that ‘humanizing learning’ is the most effective way to build online learning communities; this occurs when students connect on an emotional and social level while engaging with the community. The author, Sharon O’Malley, suggests six methods for achieving this:

  1. Let students get to know you – instructors need to present themselves as ‘real people’ – this can be done by appearing goofy or telling genuine anecdotes in videos, for example. Students should also be encouraged to reveal their non-academic lives, in order for others to feel more like they know them personally, rather than just in the learning context
  2. Incorporating video and audio resources and feedback
  3. Meet in real time – students can talk to each other in real time and make instant connections
  4. Work in small groups – students get connected with others in their group – instead of feeling like they’re in a class of fifty, they feel they are in a class of 5, 10 etc.
  5. Require constant interaction – group projects and collaborative writing assignments force students to engage with each other out of the session
  6. Rise to the challenge – building community takes time – it takes planning and experimentation. Stick with it if it doesn’t immediately work!

Roger introduced a Building learning communities card activity. This is an activity from QAA Scotland, designed to stimulate discussion about what helps an effective learning community. The activity cards suggest the following factors:

  • Clearly defined and inclusive values
  • A clearly articulated and shared purpose
  • Clearly articulated and shared goals
  • Active and vibrant interaction
  • Owned and managed by its people
  • Dedicated structure
  • Collaboration
  • Adequate and appropriate support
  • Understood and respected expectations
  • Adequate and appropriate resources
  • Built in evaluation

The instructions ask the group to consider which of these are essential and which are “nice to haves”. The activity was certainly effective in stimulating discussion in the reading group.

Suzi watched Building Community: A Conversation with Dave Cormier – a recording of an edX webinar from 2014 (video). Here Cormier, who coined the term MOOC, talks to edX about how they could and should use online learning communities.

Cormier talks about four models of learning that you could scale up online:

  • One-to-one (adaptive learning, tutoring on Skype?)
  • One-to-many (video lectures on MOOCs)
  • Cooperative learning: many-to-many, all working on the same thing
  • Collaborative learning: many-to-many, shared interest but each with own project

Collaborative learning is the one which he thinks is particularly – perhaps only – served by online communities. The real life equivalent being chaos, or maybe conferences (which, arguably, don’t work well for learning).

He draws the distinction between mastery learning (where skills can be ticked off a list as you progress) and complexity. Communities are not a particularly useful tool for mastery, or for checking who has learnt what. They are much better suited for complexity. This seemed to echo discussions we’d had about the difference between using gamification and using playfulness in learning – gamification being more for mastery, playfulness for complexity.

Cormier offers some tips on building a successful community.

  • A community should have, or should move people towards building, shared values and a shared language.
  • Drive participation by having a famous person (but this can become one-to-many) or by asking annoying questions that people can’t resist engaging with (eg “how do we recognise cheating as a valuable part of education?”).
  • Shape participation by assigning roles to people and having course leader presence to set the tone.
  • Give people ways to get to know each other and make connections: recognising who people are and recognising aspects of yourself in them.

His view on evaluation and measuring success might be more specific to the MOOC context. He suggests borrowing techniques from advertising to demonstrate their value (but he doesn’t give details). The outcomes he suggests you might hope for are things like building more interest in your research area, or building the brand of an academic / department / institution.

He also asks some interesting questions. About the authenticity of work we give to students – how will their work persist? Can it be right that so much of students’ work is destined to be thrown away? About life beyond the community – how will the community persist? Communities are emotional – you shouldn’t just pull the plug at the end.

Lots of this is challenging in an educational context. For instance, communities take time to build but we generally work with units that last for a teaching block at most. Our online Bristol Futures courses only last four weeks. I wonder if this is to do with setting expectations. Perhaps we need thin and thick communities: the thin communities being time-bound but with much more scaffolding and a narrower purpose, the thick communities being more what Cormier is talking about here.

I also read The year we wanted the internet to be smaller (on the growth of niche communities in 2017) and 11 tips for building an engaged online community (practical advice aimed at NGOs). Both are interesting in their own right and worth a read. In both, the ideas of shared values, shared language and a sense of purpose came up. They also talk about recognition: communities as a place where you find “your people”. This resonates with my positive experiences of online communities but is, again, challenging in an education context. As Suzanne pointed out, I think, if the tone and being among “your people” is important, you must be able to walk out and find something different if you don’t feel comfortable. And it may be far better that you work with people who aren’t just “your people”, or at least who don’t start that way.

Suggested reading

Online communities in education

From other sectors

Education communities – articles that are 10+ years old

Suggested listening

Miscellany – notes from the reading group

No theme this month – just free choice. Here’s what we read (full notes below):

Naomi read Stakeholders’ perspectives on graphical tools for visualising student assessment and feedback data.

This paper from the University of Plymouth, by Luciana Dalla Valle, Julian Stander, Karen Gresty, John Eales and Yinghui Wei, looks at the development and progression of learning analytics within higher education. It covers how four graphical visualisation methods can be used by different stakeholders to interpret assessment and feedback data, the stakeholders being external examiners, learning developers, industrialists (employers), academics and students.

The paper discusses how there is often difficulty pulling information out of assessment and feedback, as there can be a lot of data to cover. Graphical visualisations mean information can be shared and disseminated quickly, as there is one focal point to concentrate on. It’s mentioned that some can include ‘too much information that can be difficult for teachers to analyse when limited time is available’, but it is also argued that it is important to evaluate the visualisations from the point of view of the different stakeholders who may be using them.

The paper looks at how learning analytics can be seen as a way to optimise learning and allow stakeholders to fully understand and take on board the information that they are provided with. For students it was seen as a way to get the most out of their learning whilst also flagging students facing difficulties. The paper also talks about how it brings many benefits to students described as the ‘overlooked middle’. Students are able to easily compare their assessments, attainment and feedback to see their progression. Students agreed that the visualisations could assist with study organisation and module choice, and it’s also suggested that taking these analytics into account can improve social and cultural skills. For external examiners, analytics was seen as a real step forward in their learning and development. For them it was a quick way to assimilate information and improve their ‘knowledge, skills and judgement in Higher Education Assessment’. Having to judge and compare academic standards over a diverse range of assessment types is difficult, and visual graphics bring a certain simplicity to this. For learning developers too, using these images and graphics is suggested to help in ‘disseminating good practice’.

The paper goes on to explain how it does improve each stakeholder’s evaluation of assessment. It goes into a lot of detail about the different visualisations suggested, commenting on the benefits and drawbacks of each, which is worth a more detailed look. It should also be noted that the paper suggests there could be confidentiality or data protection issues involved with sharing or releasing data such as this, as in most cases this data is only seen at faculty or school level. Student demoralisation is also mentioned near the end of the paper as a contributing factor to why these graphics may not always work in the best ways. It finishes by suggesting it would be interesting to study changes in students’ confidence and self-esteem due to assessment data sharing. It’s an interesting idea that needs to be carefully thought out and analysed to ensure it produces a positive and constructive result for all involved.

Suzanne read: Social media as a student response system: new evidence on learning impact

This paper begins with the assertion that social media is potentially a “powerful tool in higher education” due to its ubiquitous nature in today’s society, while also recognising that to date the role of social media in education has been a difficult one to pin down. There have been studies showing that it can both enhance learning and teaching and be a severe distraction for students in the classroom.

The study sets out to answer these two questions:

  • What encourages students to actively utilise social media in their learning process?
  • What pedagogical advantages are offered by social media in enhancing students’ learning experiences?

To look at these questions, the researchers used Twitter in a lecture-based setting with 150 accounting undergraduates at an Australian university. In the lectures, Twitter could be used in two ways: as a ‘backchannel’ during the lecture, and as a quiz tool. As a quiz tool, the students used a specific hashtag to Tweet their answers to questions posed by the lecturer at regular intervals during the session, related to the content that had just been covered. These lectures were also recorded, and a proportion of the students only watched the recorded lecture as they were unable to attend in person. Twitter was used for two main reasons. First, the researchers assumed that many students would already be familiar and comfortable with Twitter. Second, using Twitter wouldn’t need any additional tools, such as clickers, or software (assuming that students already had it on their devices).

Relatively early on, several drawbacks to using Twitter were noted. There was an immediate (and perhaps not surprising?) tension between the students’ and lecturers’ public and private personas on Twitter. Some students weren’t comfortable Tweeting from their own personal accounts, and the researchers actually recommended that lecturers make new accounts to keep their ‘teaching’ life separate from their private lives. There was also a concern about the unpredictability of tapping into students’ social media, in that the lecturer had no control over what the students wrote, in such a public setting. It also turned out (again, perhaps unsurprisingly?) that not all students liked or used Twitter, and some were quite against it. Finally, it was noted that once students were on Twitter, it was extremely easy for them to get distracted.

In short, the main findings were that the students on the whole liked and used Twitter for the quiz breaks during the lecture. Students self-reported being more focused, and that the quiz breaks made the lecture more active and helped with their learning as they could check their understanding as they went. This was true for students who actively used Twitter in the lecture, those who didn’t use Twitter but were still in the lecture in person, and those who watched the online recording only. During the study, very few students used Twitter as a backchannel tool, instead preferring to ask questions by raising a hand, or in breaks or after the lecture.

Overall, I feel that this supports the idea that active learning in lectures is enhanced when students are able to interact with the material presented and the lecturer. Breaking up content and allowing students to check their understanding is a well-known and pedagogically sound approach. However, this study doesn’t really demonstrate any benefit in using Twitter, or social media, specifically. The fact that students saw the same benefit regardless of whether they used Twitter to participate, or were just watching the recording (where they paused the recording to answer the questions themselves before continuing to the answers), seems to back this up. In fact, in not using Twitter in any kind of ‘social’ way, and trying to hive off a private space for lecturers and students to interact in such a public environment, the study seems to miss the point of social media altogether. For me, the initial research questions therefore remain unanswered!

Suzi read Getting things done in large organisations

I ended up with a lot to say about this so I’ve put it in a separate blog post: What can an ed techie learn from the US civil service? Key points for me were:

  • “Influence without authority as a job description”
  • Having more of a personal agenda, and preparing for what I would say if I got 15 minutes with the VC.
  • Various pieces of good advice for working effectively with other people.

Chrysanthi read Gamification in e-mental health: Development of a digital intervention addressing severe mental illness and metabolic syndrome (2017). This paper talks about the design of a gamified mobile app that aims to help people with severe chronic mental illness in combination with metabolic syndrome. While the target group is quite niche, I love the fact that gamification is used in a context that considers the complexity of the wellbeing domain and the interaction between mental and physical wellbeing. The resulting application, MetaMood, is essentially the digital version of an existing 8-week-long paper-based program with the addition of game elements. The gamification aims to increase participation, motivation and engagement with the intervention. It is designed to be used as part of a blended care approach, combined with face-to-face consultations. The game elements include a storyline, a helpful character, achievements, coins and a chat room, for the social element. Gamification techniques (tutorial, quest, action) were mapped to traditional techniques (lesson, task, question) to create the app.

The specific needs of the target group required the contributions of an interdisciplinary team, as well as relevant game features; eg the chat room includes not only a profanity filter, but also automatic intervention when keywords like suicide are used (informing the player of various resources available to help in these cases). Scenarios, situations and names were evaluated for their potential to trigger patients, and changes were made accordingly; eg the religious-sounding name of a village was changed, as it could have triggered delusions.
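
The keyword-triggered intervention described above could be sketched roughly as follows. This is purely my own illustration, not the actual MetaMood implementation – the keyword list, profanity list and response message are all invented:

```python
# Hypothetical sketch of a chat-room moderation step like the one the
# paper describes: a profanity filter plus an automatic intervention
# when crisis keywords appear. Not the actual MetaMood code.

CRISIS_KEYWORDS = {"suicide", "self-harm"}   # invented examples
PROFANITY = {"badword"}                      # placeholder list

CRISIS_RESPONSE = (
    "It sounds like you might be going through a difficult time. "
    "Support is available - please consider contacting a crisis line."
)

def moderate(message: str):
    """Return (cleaned_message, intervention) for one chat message."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    # Automatic intervention: point the player at help resources.
    intervention = CRISIS_RESPONSE if any(w in CRISIS_KEYWORDS for w in words) else None
    # Profanity filter: mask disallowed words, leave the rest intact.
    cleaned = " ".join(
        "***" if w.lower().strip(".,!?") in PROFANITY else w
        for w in message.split()
    )
    return cleaned, intervention

cleaned, note = moderate("I keep thinking about suicide")
# note now holds the crisis-support message; cleaned is unchanged
```

A real system would of course need far more care (clinician-reviewed keyword lists, handling of misspellings, escalation paths), which is exactly why the paper stresses the interdisciplinary team.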

The four clinicians who reviewed the app said it could proceed to clinical trial with no requirement for further revision. Most would recommend it to at least some of their clients. The content was viewed as acceptable and targeted by most, and the app as interesting, fun and easy to use. I wish there had been results of the clinical trial, but it looks like this is the next step.

Roger read “Analytics for learning design: A layered framework and tools”, an article from the British Journal of Educational Technology.

This paper explores the role analytics can play in supporting learning design. The authors propose a framework called the “Analytics layers for learning design (AL4LD)”, which has three layers: learner, design and community analytics.

Types of learner metrics include engagement, progression and student satisfaction while experiencing a learning design. Examples of data sources are VLEs or other digital learning environments, student information systems, sensor-based information collected from physical spaces, and “Institutional student information and evaluation (assessment and satisfaction) systems”. The article doesn’t go into detail about the latter – for example, it doesn’t explore the generic nature of many evaluations (eg the NSS), which are unlikely to provide meaningful data about the impact of specific learning designs.

Design metrics capture design decisions prior to the implementation of the design. Examples of data include learning outcomes, activities and tools used to support these. The article emphasises that “Data collection in this layer is greatly simplified when the design tools are software systems”. I would go further and suggest that it is pretty much impossible to collect this data without such a system, not least as it requires practitioners to be explicit about these decisions, which otherwise often remain hidden.

Community metrics are around “patterns of design activity within a community of teachers and related stakeholders”, which could be within or across institutions. Examples of data include types of learning design tools used and popular designs in certain contexts. These may be shared in virtual or physical spaces to raise awareness and encourage reflection.

The layers inter-connect, eg learning analytics could contribute to community analytics by providing evidence for the effectiveness of a design. The article goes on to describe four examples. I was particularly interested in the third one, which describes the “experimental Educational Design Studio” from the University of Technology Sydney. It is a physical space where teachers can go to explore and make designs, ie it also addresses the community analytics layer in a shared physical space.

This was an interesting read, but in general I think the main challenge is the collection of data in the design and community layers. For example, Diana Laurillard has been working on systems to do this for many years – eg the Learning design support environment and the Pedagogical Patterns Collector – but there seems to have been little traction.

Amy read: Addressing cheating in e-assessment using student authentication and authorship checking systems: teachers’ perspectives. Student authentication and authorship systems are becoming increasingly well-used in universities across the world, with many believing that cheating is on the rise across a range of assessments. This paper looks at two universities (University A in Turkey and University B in Bulgaria) which have implemented the TeSLA system (an Adaptive Trust-based eAssessment System for Learning). The paper doesn’t review the effectiveness of the TeSLA system, but rather the views of the teachers on whether the system will affect the amount of cheating taking place.

The research’s main aim is to explore the basic rationale for the use of student authentication and authorship systems, and within that, to look at four specific issues:

  1. How concerned are teachers about the issue of cheating and plagiarism in their courses?
  2. What cheating and plagiarism have teachers observed?
  3. If eAssessment were introduced in their courses, what impact do the teachers think it might have on cheating and plagiarism?
  4. How do teachers view the possible use of student authentication and authorship checking systems, and how well would such systems fit with their present and potential future assessment practices?

Data was collected across three different teaching environments – face-to-face teaching, distance learning and blended learning – via questionnaires and interviews with staff and students.

The findings, for the most part, were not hugely surprising: the main type of cheating that took place at both universities was plagiarism, followed by ghost-writing (or the use of ‘essay mills’). These were the most common methods of cheating in both exam rooms and online. The reasons staff believed students cheated and the reasons students themselves gave also varied widely. Both teachers and students believed that:

  • Students wanted to get higher grades
  • The internet encourages cheating and plagiarism, and makes it easy to do so
  • There would not be any serious consequences if cheating and plagiarism was discovered

However, teachers also believed that students were lazy and wanted to take the easy way out, whereas students blamed pressure from their parents and the fact that they had jobs as well as studying.

Overall, staff were concerned about cheating, and believed it was a widespread and serious problem. The most common problems were plagiarism and ghost-writing, followed by copying and communicating with others during assessments. When asked about ways of preventing cheating and plagiarism, teachers were most likely to recommend changes to educational approaches, followed by assessment design, technology and sanctions. Teachers across the different teaching environments (face-to-face, blended and distance learning) were all concerned about the increase in cheating that may take place with online/eAssessments. This was especially the case for staff who taught on distance learning courses, where students currently take exams under strict conditions. Finally, all staff believed that the use of student authentication and authorship tools enabled greater flexibility, both in access for those who find it difficult to travel and in forms of assessment. However, staff believed that cheating could still take place regardless of these systems, but that the technology could be used in conjunction with other tools and methods to reduce cheating in online assessments.

What can an ed techie learn from the US civil service?

I recently read Getting things done in large organisations by Thomas Kalil (profession: “expert”, according to Google). Kalil worked for the Clinton and Obama administrations on science and technology policy. This is his attempt to share what worked for him. I was interested because, 12 – nearly 13 – years in at Bristol, I’m still learning how to get things done. From what I understand of their structure and rate of change, the civil service and universities are at least in some ways similar.

The paper is aimed at “policy entrepreneurs”: people who generate or spot new ideas, then evaluate and (if appropriate) help make them happen. I grew up in the 80s and the word entrepreneur brings to mind Rodney from Only Fools and Horses … I can’t imagine wanting to apply the term to myself. But the principle certainly applies within my role, and indeed many professional roles.

Kalil starts by giving a bit of a career history, which is probably only relevant if you would like to become a policy advisor; this takes up the first six pages. The remaining ten pages are pretty solidly filled with good advice. Here are some of the things most directly applicable to my role in digital education…

“Influence without authority as a job description” – this resonates, working in an organisation that is still often operating on goodwill and people’s desire to cooperate

Thought experiment: What if you had 15 minutes with the president (in my case the VC), and if he liked your idea he would be willing to call anyone to make it happen. Kalil developed this as a way of making people think seriously about what they would change, and who would be in a position to do it. Follow up questions:

  • Would the people called be willing & able to do it?
  • Is there anything we could change (for them or about the proposal) to make them more willing or more able?
  • What existing forums / mechanisms are there that could carry this forward? (This also relates to the paper from a previous reading group on evidence and the question: would the initiative continue if we walked away?)

There’s something empowering about having an answer to this prepared. I don’t, but I will.

How does your remit fit into the bigger picture? Related to the thought experiment above is the importance of keeping aware of – and actively looking for – areas where digital education can further the broader aims of the university.

Partnership working (working collaboratively) works best when you have good relationships. Both sides need to:

  • Understand each other’s priorities
  • Trust each other to follow-through
  • Feel able to disagree and raise concerns

Relationships need to be a two-way street, not just one side dictating. It’s also important to understand the internal politics and personalities you are working with. Clearly the bulk of this is good advice for all relationships, professional and personal.

Have an agenda. In my experience teams do tend to do this but, for personal job satisfaction at the very least, having a personal agenda makes some sense. Kalil has some good questions to ask on this (go read them) but key for me is: why do I believe this is the right thing to do and that it will work? Also, recognise that you won’t know the answers without listening to (and asking interesting questions of) other people.

Make it easy for other people to help you. The example Kalil gives is: if you want someone to send an email, write it for them. Closely related to this is his advice for making follow-up more likely to happen: ask people if they think they can complete their assignment; document and follow up commitments; have deadlines, even if they’re artificial; if someone isn’t following up, try to find out why.

Understand what tools are available to you. What are the things you / your team / the organisation can do to effect change?

Be open to ideas from a range of sources. Engage with people from outside of your own contexts. Adapt and imitate what works elsewhere.

Plan for a change in administration (surprisingly applicable in universities, this one).

And some don’ts:

  • Don’t try to do too many things at once
  • Don’t act on the urgent and forget the important
  • Don’t spend too much time on reports
  • Don’t let things drag on indefinitely
  • Don’t surprise people, they don’t like it

Nothing earth-shattering perhaps but good solid advice, much of which it’s worth being reminded of. I’d recommend it.

Wellbeing – notes from the reading group

Roger read: Curriculum design for wellbeing.

This is part of an online professional development course for academics, produced by a project run by a number of Australian universities, co-ordinated by the University of Melbourne. It aims to build the capacity of university educators to design curriculum and create teaching and learning environments that enhance student mental wellbeing. There are five modules: student wellbeing, curriculum design, teaching strategies, difficult conversations and your wellbeing.

I focussed on module 2, which is on curriculum design. It starts by stressing the importance of students, through the curriculum, experiencing autonomous motivation, a sense of belonging, positive relationships, and feelings of autonomy and competence (M-BRAC). All of these are aspects of good practice in curriculum design.

It goes on to consider how elements of curriculum design support student mental wellbeing, covering alignment, organisation and sequencing of content, engaging learning activities and a focus on assessment for learning.

For example, aligning ILOs with assessment and learning activities helps student autonomy, as students understand how what they are doing contributes to their goals, and they develop their knowledge and skills, including self-regulation, to achieve the ILOs. Assessment for learning plays a key role here. Clear organisation and sequencing of content contribute to effective learning. Both alignment and structure help to build students’ sense of competence. Engaging and meaningful learning activities can increase student motivation and encourage peer interaction, which can contribute to building relationships and a sense of belonging.

It suggests that when reviewing the curriculum one should ask:

  • How will the curriculum be experienced by my (diverse) students eg international students, mature students, “first in family” students?
  • Will the curriculum foster or thwart experiences of M-BRAC?

The module then has a number of FAQs you are asked to consider, with suggested answers. These were really useful as they tease out some of the complexities, for example “Is setting group assignments in the first year a good way of helping students develop positive peer relationships, and feel a sense of connectedness or belonging?”  Here they recognise that if not well-designed or if students are not supported to develop group work skills it can have a negative impact.

The module ends with a set of case studies illustrating how curricula have been re-designed to better support M-BRAC.

Amy read: Approach your laptop mindfully to avoid digital overload.

This was a short article recognising the ever-increasing belief that we are being constantly bombarded with masses of new information which, in turn, means that many are suffering from stress-related diseases, anxiety and depression. Our reliance on digital devices to provide constant streams of information in the form of news articles, social media feeds and messages means that we feel lost without them. A full digital detox is suggested at the beginning of the article, though this may be a short-term solution and often an impractical one.

The authors suggest introducing the practice of mindfulness into our lives to combat this. They describe mindfulness as ‘a moment-to-moment attention to present experience with a stance of open curiosity’. Mindfulness has been studied extensively by the medical community and has been shown to help with stress, anxiety and depression. One can ‘reprogramme’ the mind to deal with stresses more easily by training it to be more present. The authors suggest two key ways to introduce mindfulness into our use of digital devices to reduce the pressures they can put on us.

The first method is ‘mindful emailing’, which includes practices such as taking three breaths before responding to a stressful email, and considering the psychological effect that the email will have on its recipients.

The second is the mindful use of social media: ‘checking our intentions before uploading a feed (post?), being authentic in our communications and choosing the time we spend on social media, rather than falling into it.’

If you haven’t tried mindfulness before take a look at these tips and short audio meditations.

Chrysanthi read: Designing a product with mental health issues in mind.

This article – true to its title – talks about including technological features that aim to help vulnerable users. While the examples given are taken from a banking application context, the suggestions can be applied to other contexts. More specifically, the article mentions positive friction and pre-emptive action. Positive friction goes against developers’ usual desire to make everything easier and faster, and instead aims to put some necessary obstacles in the way for users who need them. The example used is allowing certain users with somewhat “impulsive” behaviour to check their recent purchases and confirm that they indeed want them. This would help, eg, people with bipolar disorder, who may overspend in a manic phase, often at night, and slip into depression in the morning because of their irreversible mistake. In the specific app, this is still a speculative feature. Pre-emptive action aims to prevent trouble by anticipating certain events and acting on them, eg noticing a halt in income and sending a well-timed notification to start a conversation so the person doesn’t end up in debt (and therefore under more stress). The article also suggests allowing vulnerable customers to choose their preferred time and form of communication (eg the phone might be anxiety-inducing, or email might seem complex).

In an education context, positive friction could be relevant where students are repeatedly doing things they no longer need to do. This would help when – under the illusion they are still learning – students focus on redoing exercises they already know how to do, which might help them feel accomplished but stops adding value at some point, or on consuming more and more content even when they haven’t actually digested what they have learned so far. It isn’t very clear how this could be applied during the exam period, though. Pre-emptive action is perhaps easier to translate into an educational context. Any action (or inaction) that is either outside the student’s usual pattern or outside a successful pattern might be a conversation starting point, or a trigger for suggestions of alternative ways to handle their studies. Students could also be offered different options for learning and communicating with their professors and peers.
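
As a rough illustration of pre-emptive action in an education context, a system might compare a student's recent activity with their own earlier baseline and flag a sharp drop as a prompt to start a conversation. This is entirely my own sketch – the data shape and thresholds are invented:

```python
# Hypothetical sketch: flag students whose recent activity falls well
# below their own usual pattern. Thresholds are invented for illustration.

def flag_unusual_inactivity(weekly_logins, recent_weeks=2, drop_ratio=0.5):
    """Return True if the mean of the last `recent_weeks` of logins is
    below `drop_ratio` times the student's earlier baseline mean."""
    if len(weekly_logins) <= recent_weeks:
        return False  # not enough history to establish a baseline
    baseline = weekly_logins[:-recent_weeks]
    recent = weekly_logins[-recent_weeks:]
    baseline_mean = sum(baseline) / len(baseline)
    recent_mean = sum(recent) / len(recent)
    return baseline_mean > 0 and recent_mean < drop_ratio * baseline_mean

# A student who usually logs in ~10 times a week, then nearly stops:
flag_unusual_inactivity([10, 12, 9, 11, 2, 1])   # True
flag_unusual_inactivity([10, 12, 9, 11, 10, 9])  # False
```

The key design point, as in the banking example, is that the comparison is against the student's own pattern rather than a cohort average, and that a flag triggers a human conversation rather than an automatic sanction.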

Chrysanthi also read: E-mail at work: A cause for concern? The implications of the new communication technologies for health, wellbeing and productivity at work.

This paper explores the potential negative implications of using email at work. The email features considered potentially problematic are speed, multiple addressability, recordability, and processing & routing: essentially, email as a message that can be instantly transmitted to several people at once, automatically stored, and easily altered, with different versions sent to various people, not all of whom are necessarily visible on the recipient list. According to the authors, emails may increase stress by increasing workload and interruptions, making messages and tone harder to interpret, increasing misunderstandings and groupthink, reducing helpful argumentation and social support, increasing isolation and surveillance – which increases discontent – offering a new ground for bullying and harassment, and hindering the processing of negative feelings. Having established these potential negative implications, the authors point out that more research is needed to understand the optimal ways to use email at work, for effective communication and humane workplaces.

Naomi read: Did a five-day camp without digital devices really boost children’s interpersonal skills?

This article was about a brief study led by Yalda Uhls in Southern California. It studied two groups of pupils who on average spent 4.5 hours a day ‘texting, watching TV, and video gaming’. Half of the children were sent on a five-day educational camp in the countryside where all technical devices were banned. The other half stayed at school as usual. Emotional and psychological tests were carried out on the students before and after the five days.

There was a small amount of evidence to suggest that the children who had spent time away from devices improved psychologically over the five days. However, because there were several small problems with the study, no firm answers can be taken from it. It’s suggested that the children who went away for the five days only looked like they improved on the tests because they started at a lower level than the children who stayed at school. It was also suggested that the scores of the children who stayed at school deteriorated because they were tired from doing a week’s work.

As the article suggests, the results of this study weren’t particularly hard-hitting, but it does raise the question of how much the younger generations are using their devices throughout the day. Uhls does admit in the article that there were shortcomings to the study, but suggests that these findings relate to the ‘wider context of technology fears’ and hopes the paper will be ‘a call to action for research that thoroughly and systematically examines the effects of digital media on children’s social development.’ Although the study needed a more comprehensive approach, the ideas behind it are interesting and relate to several issues that we see in everyday life – is it good for us to spend so much time on our devices, or is it integral to how we live now?

Suzi read: Learning in the age of digital distraction

This interview with Adam Gazzaley, a neurologist, is a plug for a book called The Distracted Mind, in which he and research psychologist Larry D Rosen talk about how the way our brains are wired influences how we use technology.

They suggest that information-foraging is a development of our evolved food-foraging mechanisms, and so is to some extent driven by our very basic drive to survive. Because of this it is hard to prevent it from distracting us from our ability to set and pursue higher-level goals.

Information-foraging doesn’t just impact people’s ability to focus; it can cause anxiety and stress, and affect mood.

Suggestions for possible ways to combat this include:

  • accepting that we need to (re-)learn to focus (they are also developing brain-training video games)
  • using play in education (though this was only briefly mentioned, and I wasn’t clear whether this meant playfulness or gamification)
  • physical exercise
  • mindfulness
  • sitting down to dinner as a family, or otherwise building in device-free interaction time

Suggested reading