Evidence in teaching – notes from the reading group

Suzi read Why “what works” won’t work: evidence-based practice and the democratic deficit in educational research (Biesta, G., 2007) and a chapter by Alberto Masala from the forthcoming book From Personality to Virtue: Essays in the Philosophy of Character, ed. Alberto Masala and Jonathan Webber, OUP, 2015.

Biesta gives what is broadly an argument against deprofessionalisation, in the context of government literacy and numeracy initiatives at primary school level. I found the main argument somewhat unclear. It was most convincing on the difficulty of defining what education is for, which makes it difficult to test whether an intervention has worked. Biesta talks at length about John Dewey and his description of education as a moral practice and of learning as reflective, experimental problem solving.

“A democratic society is precisely one in which the purpose of education is not given but is a constant topic for discussion and deliberation.”

Masala’s paper is on virtue/character education but is of wider interest as it talks very clearly about educational theory. I found particularly useful in this context the distinction between skill as competence (defined by performance, so easily testable) and skill as mastery (defined by a search for superior understanding, so less easily tested), and the danger of over-emphasising competence.

Hilary read Version Two: Revising a MOOC on Undergraduate STEM Teaching, which briefly outlined some key approaches and intended developments in a Coursera MOOC aimed at STEM graduates and postdocs interested in developing their teaching.

The author of the blog post is Derek Bruff, director of the Vanderbilt University Center for Teaching and senior lecturer in the Vanderbilt Department of Mathematics, with interests in agile learning, social media and student response systems, amongst other things: see http://derekbruff.org/.

Two key points:

  1. MOOC-centred learning communities – the MOOC adopted a facilitated blended approach, building on the physical groupings of graduate student participants by facilitating 42 learning communities across the US, UK and Australia, which used face-to-face activities to augment the course materials and improve completion rates.
  2. Red Pill: Blue Pill – adopting the metaphor used by George Siemens in the Data, Analytics and Learning MOOC, there were two ways to complete the course: either an instructor-led approach, which was more didactic and focused on the ability to understand and apply a broad spectrum of knowledge, or a student-directed approach, which used peer-graded assignments and gave students the opportunity to pick the materials that most interested them, and so focus on gaining a deeper but less comprehensive understanding of the topic.

Final takeaway – networked learning is hard, as would be the logistics of offering staff/student development opportunities as online and face-to-face modules with different pathways through the materials, but interesting…

Steve read Building evidence into education, a 2013 report by Ben Goldacre for the UK government.

A very accessible summary of the case for evidence-based pedagogy in the form of large-scale randomised controlled trials (RCTs). It compares current ‘anecdote/authority’ education research with past medical work, with lots of interesting analogies. The report focuses on primary/secondary education, but some ideas could transfer to higher education, although that would be more challenging.

Presents counterarguments to a number of common arguments against the RCT approach – it IS ethical if you are comparing methods where you don’t know which is best (and if you do know, why bother trialling?!). Difficulty in measuring is not a reason to discount the approach; RCTs are a way to remove noise. Talks about the importance of being aware of context and applicability, and uses some good medical examples to illustrate points.
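Goldacre’s “remove noise” point is essentially statistical, and a toy simulation makes it concrete. The sketch below is my own illustration (made-up score distributions, nothing from the report) of why random assignment plus scale lets a small true effect show through large individual differences:

```python
import random
import statistics

def simulate_trial(n_per_arm=200, true_effect=2.0, noise_sd=10.0):
    """Toy RCT comparing two teaching methods.

    Each pupil's test score is dominated by individual variation
    (noise_sd), with a small true benefit for the new method.
    Random assignment means that variation averages out the same
    way in both arms, so the difference in means estimates the
    true effect.
    """
    control = [random.gauss(50, noise_sd) for _ in range(n_per_arm)]
    treated = [random.gauss(50 + true_effect, noise_sd) for _ in range(n_per_arm)]
    return statistics.mean(treated) - statistics.mean(control)

# A single small class is hopelessly noisy...
print(simulate_trial(n_per_arm=15))
# ...but a large trial recovers the true effect (~2.0).
print(simulate_trial(n_per_arm=20000))
```

Run a few times, the small trial swings between negative and strongly positive estimates, while the large one settles near the true effect – which is the core of the argument for scale.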

Sketches out an initial framework – teachers don’t need to be research experts (doctors aren’t); instead there should be research-focused teams leading and guiding, with stats/trials experts and so on.

Got me thinking – definitely worth a read.

Roger read Using technology for teaching and learning in higher education: a critical review of the role of evidence in informing practice (2014) by Price and Kirkwood.

This study explores the extent to which evidence informs teachers’ use of technology enhanced learning (TEL) in higher education. It involved a literature review, an online questionnaire and focus groups. The authors found that there are differing views on what constitutes evidence, which reflect differing views on learning and may be characteristic of particular disciplines; as an example, they suggest a preference for large-scale quantitative studies in medical education.
In general, evidence is under-used by teachers in HE, with staff influenced more by their colleagues and more concerned about what works than about why. Educational development teams have an important role as mediators of evidence.

This was a very readable and engaging piece, although the conclusions didn’t come as much of a surprise! The evidence framework they used (page 6) was interesting, with impact categorised as micro (e.g. individual teacher), meso (e.g. within a department) or macro (across multiple institutions).

Mike read Evidence-based education: is it really that straightforward? (2013) by Marc Smith, a Guardian Education response to Ben Goldacre.

This is a thoughtful and well-argued response to Goldacre’s call for educational research to learn from medical research, particularly in the form of randomised controlled trials. Smith is not against RCTs, but suggests they are not a silver bullet.

Smith applauds the idea that we need teachers to drive the research agenda and that we do need more evidence. His argument that it will be challenging to change the culture of teaching to achieve this seems valid, but is not necessarily a reason not to try. The thrust of his argument is that RCTs, whilst effective in medicine, are harder to apply to education due to the complexity of teaching and learning. He believes (and I tend to agree) that cause and effect are harder to determine in the educational context. Smith argues that in medicine there is a specific problem (an illness or condition) and a predefined intended outcome (a change to that condition). This can be problematic even in the medical context, but is harder still to measure in education. I would add that the environment as a whole is harder to control and interventions more difficult to replicate: different teachers could attempt to deliver the same set of interventions but actually deliver radically different sessions, to learners who will interact with the learning in a variety of ways. Can education be thought of as a change of state caused by an intervention, in the same way we would prescribe a drug for a specific ailment?

All this is not to say that RCTs cannot play a role, but that you have to think about what you are trying to research before choosing your methodology (some of the interventions Goldacre addressed related to specific, quantitatively measurable things like teenage pregnancy rates or criminal activity). Perhaps it is my social scientist bias, but I would still want to triangulate using a range of methods (quantitative and qualitative).

From a personal perspective, I sometimes think that ideas translated from science to a more social scientific context can lose some scientific validity in the process (though this is maybe more true at the level of theory than of scientific practice). For example, Dawkins translated selfish genes into the concept of cultural memes, suggesting cultural traits are transmitted in the same way as genetic code. Malcolm Gladwell’s tipping point is a metaphor from epidemiology which he applies to the spreading of ideas, bringing much metaphorical baggage in the process. Perhaps randomised controlled trials could provide better evidence for the validity of these theories too?

53 powerful ideas (well, 4 of them at least) – notes from the reading group

This month we picked articles from SEDA’s 53 powerful ideas all teachers should know about blog.

Mike read Students’ marks are often determined as much by the way assessment is configured as by how much students have learnt

Many of the points made in this article are hard to dispute. Institutions and subject areas vary so widely that how marks are determined differs not only between, say, Fine Art and Medicine, but also between similar subjects at the same institution, and between the same subject at different institutions. This may reflect policy or process (e.g. dropping the lowest mark before calculating the final grade). In particular, Gibbs argues that coursework tends to encourage students to focus on certain areas of the curriculum, rather than testing knowledge of the whole curriculum. Gibbs also feels these things are not always clear to external examiners, and he does not feel that the QAA emphasis on learning outcomes addresses these shortcomings.

The article (perhaps not surprisingly) does not come up with a perfect answer to what is a complex problem. Would we expect fine artists to be assessed in the same way as doctors? How can we ensure qualifications from different institutions are comparable? Some ideas are explored, such as asking students to write more coursework essays to cover the curriculum and then marking a sample; this is, however, rejected as something students would not tolerate. The main thing I take from this is that thinking carefully about what you really need to assess when designing the assessment is important (nothing new really). For example, is it important that students take away a breadth of knowledge of the curriculum, or develop sophistication of argument? Design the assessment to reflect the need.

Suzi read Standards applied to teaching are lower than standards applied to research and You can measure and judge teaching

The first article looks at the difference between the way academics are trained for teaching and for research, and between the way teaching and research are evaluated and accredited. Teaching, as you might imagine, comes off worse in all cases. There aren’t any solutions proposed, though the author muses on what would happen if the two were treated in the same way:

“Imagine a situation in which the bottom 75% of academics, in terms of teaching quality, were labelled ‘inactive’ as teachers and so didn’t do it (and so were not paid for it).”

The second argues that students can evaluate courses well if you ask them the right things: to comment on behaviours which are known to affect learning. There didn’t seem to be enough evidence in the article to really evaluate his conclusions.

The argument put at the end seemed sensible: that evaluating for student engagement works well (while evaluating for satisfaction, as we do in the UK, doesn’t).

The SEEQ (Students’ Evaluation of Educational Quality), a standardised (if long) list of questions for evaluating teaching by engagement, looks like a useful resource.

Roger read Students do not necessarily know what is good for them.

This article describes three examples where students and/or the NUS have demanded, or expressed a preference for, certain things which may not actually be to their benefit in the longer term. Gibbs believes that these cases can be due to a lack of sophistication in learners (“unsophisticated learners want unsophisticated teaching”) or a lack of awareness of what the consequences of their demands might be (in policy or practice). The first example is class contact hours. Gibbs asserts that there is a strong link between total study hours (including independent study) and learning gain, but no such link between class contact hours and learning gain; increasing contact hours often means increasing class sizes, which generally means a dip in student performance. Secondly, he looks at assessment criteria, saying that students are demanding “ever more detailed specification of criteria for marking”, which he states is ineffective in itself for helping students get good marks, as people interpret criteria differently. A more effective mechanism would be discussion of a range of examples where students have approached a task in different ways, and of how these meet the criteria. Thirdly, he says that students want marks for everything, but evidence suggests that they learn more when receiving formative feedback with no marks, as otherwise they can focus more on the mark than on the feedback itself.

The solution, he suggests, is to make evidence-based judgements which take into account student views but are not entirely driven by them, to try to help students develop their sophistication as learners, and to explain why you are taking a certain approach. This article resonated with me in a number of ways, especially with regard to assessment criteria and feedback. There is an excellent example of practice in the Graduate School of Education, where the lecturer provides a screencast in which she goes through an example of a top-level assignment, explaining what makes it so good. She has found that this has greatly reduced the number of student queries along the lines of “What do I need to do to get a first / meet the criteria?”. I also strongly agree with his point about explaining to students the rationale for taking a particular pedagogic approach. Sometimes we can assume that students know why a certain teaching method is educationally beneficial in a particular context, but in reality they don’t. And sometimes students resist particular approaches (peer review, anyone?) without necessarily having insight into how these may be helpful for their learning.

#EDCMooc – the view from the other side

By Hilary Griffiths

Now the dust has settled I thought it might be useful to post some thoughts on our EDCMOOC experience. Once a week educational technologists, students and academics had the opportunity to meet for a coffee, and to reflect on their experience of participating in a MOOC – these are some of the thoughts expressed during those meetings.

Only two or three of the group had participated in a MOOC before so it’s perhaps unsurprising that the most common reason for participation cited in the first meeting was curiosity – what exactly is it like to be a student on a MOOC?

The general impression after week 1 was one of feeling overwhelmed – by the range of tools participants were directed to use, the perceived lack of explicit direction or course structure, and the amount of “noise” in the environment. Some participants struggled initially to make sense of how they were expected to use the tools (which included Facebook, Google+, and Twitter, as well as in-MOOC discussion fora). One participant cited the fact that they didn’t want to have to sign up to Facebook or Twitter, but through the ensuing discussion it became clear that, given the number of participants, you didn’t need to use all of the suggested tools: you could pick a couple you were most comfortable with and still get a good experience of the course.

It was interesting that the participants cited noise as adding to their feeling of being swamped by the MOOC – the sheer amount of information being uploaded, commented on, communicated, microblogged and hyperlinked to was overwhelming, especially if you arrived in a discussion or activity area some time after it had started. Given that participants use a range of ways to filter and organise the information they receive in their lives outside the MOOC, it is telling that, at least initially, they did not seem to apply the same strategies within the MOOC. Generally, better ways to filter and surface activities were seen as key – along with some way of allowing late arrivals to jump into activities without having to wade through masses of information, for example a daily digest of key discussion board conversations to allow later arrivals to contribute to the current conversation more easily.

A concern from a current undergraduate student was the perceived lack of validation of her learning. Was she learning what she should be? Was her understanding correct? In the absence of feedback from the MOOC academics, the student was relying on validation by peer consensus, in a course where a lack of academic rigour characterised many of the contributions.

My perception was that those who had the most enjoyable and engaged experience of the MOOC were those who engaged early and managed to form small, self-supporting groups, which helped mitigate both the information overload and the lack of a present academic by filtering information, alerting group members to things they may have missed, and offering feedback on their learning. Groups offered a way to move beyond the experience of the central discussion boards, often characterised by a lot of posts but not a great deal of dialogue, into an area where participants could start to develop a sense of the experience and expertise of the people they were communicating with. One benefit of the MOOC’s use of external social media like blogs and Twitter is that these conversations can continue after the course has finished. A final suggestion was that perhaps we should lobby for some kind of advisory service for students to consult before they sign up for a MOOC – MOOCAS, anyone?

6 very good things about MIT’s #medialabcourse MOOC

I started taking the MIT Media Lab’s Learning Creative Learning MOOC (often referred to as #medialabcourse or LCL) at the beginning of February. It’s something I’ve done in my spare time rather than directly for work, but it’s been a great experience and I wanted to reflect on what has worked so well for me.

1. Google+ communities. Google+ turns out to be really rather good for groups and group discussions. The combination of threaded discussion (with email notifications of responses) and a micro-blogging-style front page (making it easy to scan through new posts) has certainly promoted impressively engaging and lively discussion. It’s even (and I can’t believe I’m saying this about a Google product) nice to look at.

2. Small groups. People who enrolled in time were placed into small groups, each with its own email list, and each encouraged to set up its own Google+ community. These small groups (my own included) have largely petered out – but others have survived, often by picking up refugees from the less active groups, and I joined one of those. They provide a safer, less public arena for discussion – especially for people who are perhaps less confident, or for material that doesn’t seem important / relevant / polished enough to share with the world.

3. Openness. LCL was designed to be almost entirely open, based on P2PU’s Mechanical MOOC. Course reading is published on a public website and the main community is an open Google+ community. Weekly emails are sent out to remind people about each week’s activity and reading. Even with the small groups, I get the impression it’s those which left their Google+ communities open that have survived, because they could pick up new members. As well as being a Good Thing in itself, this openness makes it easier to navigate the course and to access the materials from a range of computers and devices.

4. Variety. Each week there are suggested readings, an activity, and further resources. There’s also a video panel discussion, and of course there’s continuous activity and discussion in the Google+ community. Early in the course, the course leaders stated explicitly that people should engage with what they can and what interests them, and not feel they have to do everything. The variety of tasks and materials (some of the “readings” are short videos) makes it possible to stay engaged even when you have little time to spare.

5. Events. There is a live-broadcast panel discussion each week, directly relating to the week’s reading and activity. The video stream for these is embedded within a chat forum so that you can chat with your fellow students while you watch, and submit questions for the Q&A section at the end. These broadcasts feel very personal and inclusive; they are relaxed and conversational in tone. Course moderators join the chat rooms – providing helpful information, support with technical issues, and (maybe more than anything else) a real sense that the online participants do matter. As a teaching device, I’m not sure how well the broadcasts work – I find myself picking up fragments of the video and fragments of the chat and not properly engaging with either. But they can be a useful place to reflect on and refine my ideas, and they help give the course a nice pace.

6. Enthusiasm. Mitch Resnick, Natalie Rusk, and the rest of the course team exude enthusiasm for their subject, excitement about the course, and an openness that makes you feel like a real student. They seem friendly and genuinely interested in what online participants are saying. I think their attitude sets the tone for the community as a whole.

Bristol at the #EDCMOOC

A number of staff at Bristol signed up for the Elearning and Digital Cultures MOOC (Massive Open Online Course). Here are some of their thoughts now that it has finished.

Joseph Gliddon – Learning Technologist, Technology Enhanced Learning Team

For the past five weeks my evenings have been taken up with the Elearning and Digital Cultures MOOC, and it has – for me – been a great learning experience.

It was a chance to reflect on the day job, but at one step removed: rather than “How can I use technology to improve the learning experience at Bristol?” it was more “What is technology doing to learning (and to humans), and is it a good thing?”. Also, as a sci-fi fan, it was enjoyable to engage with my interests in an academic setting.

It was a cMOOC, with the c standing for connectivism, as opposed to an xMOOC, which is about providing information in a structured form to the students (the “best” definition of x I can find is x = instructivist – never let spelling get in the way of a good acronym). The connections were – for me – what made the course so engaging: the reflections of others on the course materials were incredibly rich and interesting (the course materials were also good).

At the end of the course I had submitted my digital artefact, obtained a “Statement of Accomplishment with Distinction” and (one of my personal goals) extended my personal learning network by over 50 useful people.

So are MOOCs the end of the university as we know it? I would have to say no, and there are a lot of reasons why not which I don’t have space to go into here, so instead I will close with a brief example of what can be so special about studying at a university.

I was working with Dr Tamar Hodos in their office when a student came in to pick up their essay and feedback. Having checked with the student that it was OK for me to be in the room, the academic went over the paper with the student, discussing what was good, where improvements could be made and so on. The conversation moved on to the teaching and the time spent in the lab (the student suggested longer lab sessions, and they discussed the potential benefits of this). This was a really detailed learning experience that (provided the student takes the steps suggested) will make a real difference to the student’s studies.

Now I do realise you can’t scale that one-to-one detailed contact with an academic up to a 40,000-user MOOC, and I think that is why sometimes traditional is best (and yes, I did tell Dr Hodos how impressed I was).

Joseph

Roger Gardner – Learning Technologist, Technology Enhanced Learning Team

I enrolled primarily to see what the MOOC experience might be like. On reflection, I don’t think this was sufficient motivation to get me very far. After the first week, despite participating, I found myself quite unengaged with the content and much of the discussion, so I am looking forward to the ALT MOOC (ocTEL) in a few weeks, as I suspect the content of that might engage me more. To some extent I got what I wanted from the course, in that I had first-hand experience of MOOC-ing and some of its challenges. I know that next time I need to allow a realistic amount of time to participate, not go on holiday for a week in the middle somewhere with very flaky internet access, and try to identify and connect with other participants with similar interests early on if possible. I never got round to the assessment, but I did start to write a limerick which expresses some (I suspect common) MOOC emotions. It’s on Soundcloud, so please feel free to add your comments.
Roger

Roberta Perli – Learning Technologist, Technology Enhanced Learning Team

First, I decided to sign up for this MOOC because I am very interested in online learning. In April last year I attended a four-week online course in e-assessment run by Jisc, which was just great: a good size (about 30 people) and about the right length.

After reading about MOOCs, xMOOCs and cMOOCs in our reading group during the summer, I thought that the MOOC in digital cultures offered by Coursera would be a good opportunity to learn more about MOOCs and this ‘innovative’ (?) model of online education. I liked the pre-course activities and the interactions on different social networks, such as the Facebook group, which continues to be fairly active! I think I got a lot out of the social networks – interesting discussions, useful tips, helping people with their research, peer support with artefacts, sharing resources (e.g. lists of tools) – although I felt quite overwhelmed by the constant streams of information and interaction. I enjoyed the topic of some of the videos, but I wasn’t sure about the ‘four videos + core readings’ format for the entire course.

All in all, my first MOOC was an enjoyable experience and I think I will sign up for another MOOC in the future if I have the time to devote to it (it needs more than three hours). One thing I felt was missing was more support in terms of online learning strategies, to help students engage with the student-generated content and learn by interacting with their peers.

Roberta

Active learning – notes from the reading group

Active learning might be an unhelpfully broad topic but there are some very helpful ideas in these papers.

  • Bonwell, C. (1991), Active learning: creating excitement in the classroom, ERIC Digest – The article starts by defining what active learning is, the key factor being that students must do more than just listen: they should read, write, discuss or problem-solve. It identifies the main barrier to the use of active learning as risk – for example, that students will not participate, or that the teacher loses control – and suggests ways to address this, for example by trying low-risk strategies such as short, structured, well-planned activities.
  • Prince, M. (2004), Does Active Learning Work? A Review of the Research, Journal of Engineering Education, 93(3), 223-232 – Splits active learning into constituent parts and looks at the evidence for (often relatively minor) interventions covering each of these parts, in an attempt to identify what really works. A useful reference for anyone looking for quantitative evidence for active learning interventions, with a useful discussion of what leads to successful (or unsuccessful) problem-based learning.
  • Jenkins, M. (2010), Active Learning Typology: a case study of the University of Gloucestershire – The paper describes how an ‘active learning’ strategy has been implemented at the University of Gloucestershire. In the first paragraph Jenkins provides some references on active learning that unpack its meaning, which helped us to better understand the term and put it into context, for example: “…the role of the teacher is not to transmit knowledge to a passive recipient, but to structure the learner’s engagement with the knowledge, practising the high-level cognitive skills that enable them to make that knowledge their own” (Laurillard, 2008, p. 527). This is then compared with the understanding of ‘active learning’ held by staff at the university, who were asked through a survey to identify their conceptions of active learning. The results identified three categories or ‘families’: 1) external (students are active when they learn by doing), 2) internal (students are active when they are engaged in cognitive processes) and 3) holistic (a composite of the two, in which active learning is generally investigative, developmental and creative). An interesting perspective is the distinction in interpretation depending on whether the emphasis is placed on the student or on the teacher: is active learning what the teacher gets the students to do, or what learning is done by students? The data showed a split between some staff practising ‘active teaching’ and others practising ‘active learning’. The outcome of the project is a framework for staff to work with, which is very useful and identifies common elements of active learning in five categories: co-learning opportunities, authenticity, reflection, skills development and student support.

Applying the Mumford method to report-writing

Philosopher Stephen Mumford has developed a process for writing academic papers, known as the Mumford method. It involves producing a summary of your argument in a very particular format, using this summary when speaking (both as notes for yourself and as a handout for the audience), refining it after feedback each time you present, and eventually writing up. It has been used by everyone from professors to A-level students, and it always sounded to me like a convincing idea.

I decided to try it out when working on a recent internal report on Open Education at Bristol, in collaboration with my colleague Jane Williams, and it worked well. We initially produced a handout, roughly in the format Mumford describes. After several iterations of this handout we used it as our plan for the final briefing paper.

Although we started with the Mumford method instructions, I made some small refinements for the slightly different circumstances. My summary was:

  • single-sided
  • landscape with 4 columns of 10pt text (as the points being made tended to be relatively brief)
  • sub-divided into section headings (these did not neatly fit with the 4 columns but that was fine)
  • produced in Google Drive to allow collaboration (this involved using a table for the columns – a little fiddly but workable; a rough LaTeX equivalent is sketched below)
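For anyone who would rather typeset a similar summary than build it in Google Drive, here is a minimal LaTeX sketch of the layout described above. It is purely illustrative – we used a Google Drive table, and the package choices and margins here are my own assumptions rather than anything Mumford prescribes:

```latex
\documentclass[10pt]{article}
% Single-sided landscape page with narrow margins
\usepackage[a4paper,landscape,margin=1.5cm]{geometry}
\usepackage{multicol} % flowing columns for the summary text

\begin{document}
\pagestyle{empty} % a one-page handout needs no page numbers

\begin{multicols}{4} % four columns of 10pt text
\section*{Background}
Each point is a short, self-contained claim, kept brief.

\section*{Findings}
Section headings fall wherever the text flow puts them.

\section*{Recommendations}
The summary fits on one side and doubles as speaking notes.
\end{multicols}
\end{document}
```

The multicols environment lets headings fall mid-column, which matches the point above that the section headings did not neatly fit the four columns.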

We used this handout both for meetings with individuals and when presenting the paper at larger meetings for consultation, and it was very effective as an aid to discussion.

I was tasked with writing up and found I could relatively quickly write up the report based on the outline (which I had talked through many times by this point). Each of the four columns produced almost exactly one A4 page of relatively spare prose, more than I had anticipated. But the argument remained very clear and it was extremely easy to produce a summary of the key points, drawing almost directly from the handout. It’s definitely something I’ll use again.

Education horizons event

By Roger Gardner

This was a thoroughly enjoyable event organised by the University of Bristol Graduate School of Education. Coinciding with the School’s centenary celebrations it aimed to look ahead to potential developments and changes in education over the next 100 years. All of the speakers were excellent and thought-provoking, but here are a few personal highlights.

Dr Richard Harris kicked off by suggesting that the current face-to-face University experience will become the exception rather than the norm in future with the majority of learning in HE being “pay as you go” from large online universities, backed by a mixture of philanthropy and commercial interest.

Professor Sriram Subramanian outlined some of his work on brain-computer interfaces and gestural interfaces, as well as Morphees (“self-actuated flexible mobile devices adapting their shapes on their own to the context of use in order to offer better affordances”).

Professor Mike Fraser reassured all those teachers present that the “Robot teacher” was not coming any time soon, stressing the importance of physicality and co-presence in learning environments and highlighting the gap and the nuances separating best and mechanical practice.

After a delicious lunch (as promised!) we re-convened to vote on some of the predictions, discussing whether they were likely to happen in 10, 20, 50 or 100 years, or never. Opinions were quite varied on many of the statements we considered (most are available on Google Moderator). There was quite a bit of discussion on the subject of so-called “smart drugs”: whether use will increase, and to what extent consumption of these can be considered “cheating” when other stimulating drugs such as caffeine are commonplace.

One emerging theme of interest seemed to be the area of genetics and education, for example speculation and concerns around genetic enhancement of learning ability. Another was wearable devices (highlighted in the Horizon Report 2013 shortlist), including the possibility of student learning being monitored through use of implants or wearable devices.

So, plenty of food for thought, and a stimulating range of perspectives from the invited speakers. I particularly liked the conversational approach of the event, in which Paul Howard-Jones chatted with each panel member for ten minutes before inviting questions from the audience.