This is a guest post written by Martin Compton, Senior Lecturer in Teaching, Learning and Professional Development at the University of Greenwich in the UK.
If universities can be characterised as places of research, innovation and creativity, it may seem counter-intuitive that technology for teaching, learning and assessment (frequently referred to by the more generic term Technology Enhanced Learning, or TEL) is often met with scepticism or cynicism. Why wouldn’t teaching academics, at the cutting edge of their own disciplines, jump at the chance of using tools that promise so much? Engagement! Collaboration! Data! Part of the problem may be that TEL offers too much, and its immediate relevance and utility can seem unclear. Systems for data and resource management are lumped together with online software and downloadable apps under the TEL umbrella. Mandatory systems such as a university’s VLE might appear clunky, unintuitive and uninspiring, and such mandating can taint a busy academic’s perception of TEL (Compton & Almpanis, 2018). This article starts with a personal perspective on a major TEL issue and provides a rationale for the use of cloud-based apps such as Mentimeter. It then outlines the concept of student response systems (SRS) before covering the role that Mentimeter plays in the University of Greenwich’s TEL approach.
When we think of TEL in education, the most prominent tools are the ‘big’, institutional tools such as Virtual Learning Environments (VLEs). VLEs are often presented as tools that aid big-data collection and facilitate group management. We can count clicks, see when a student last logged in, send broadcasts to a whole programme group and attempt to start online discussions. But few lecturers place the same value on that data as institutional managers do, and many struggle to see the pedagogic benefits when set against the costs of learning how to use the systems in the first place and then populating and maintaining them, especially where their students are of the traditional, campus-based type. Tools such as these are a big and essential part of the TEL offer, but this negative perception of the big TEL tools reflects one often held by students too (Oakman, 2016) and can taint lecturers’ perceptions of what TEL can achieve for them and their students. The perception of TEL can be further tainted by such things as one’s own sense of digital self-efficacy (Wingo et al., 2017), academic ‘frailty’ leading to fossilisation of pedagogic practice (Kinchin & Winstone, 2017), or fears around data security and the reliability of the technology itself (Butler & Sellbom, 2002).
I am sensitive to these concerns in my role as an ‘Academic Developer’ and I advocate a ‘pedagogy before technology’ approach in all the work I do using and supporting the use of technology with teaching academics. My own disciplinary background is history and I think it important to stress to colleagues that although I’m interested in the potential of technology I’m not a ‘techy’ person. When I use technology for teaching I want to use things that are fit for purpose and, frankly, idiot proof. In other words, I need a tool that enables me to put my pedagogic goals first, has a shallow learning curve (for both me as the ‘author’ and students as users), is adaptable to multiple contexts and emphasises the ‘enhanced learning’ over the ‘technology’ in the term TEL.
There are countless tools available to lecturers. Perhaps I am a little ‘nerdy’ in that I enjoy trialling productivity apps and education-specific tools (especially those that are free to trial, easily accessed via apps and/or online, and intuitive both for creators and for the students who then use the resources created). As part of a university-wide endeavour to find a suitable SRS, colleagues in the Educational Development Unit (EDU) at the University of Greenwich and I reviewed dozens of SRS and filtered them against criteria including ease of use, flexibility and potential benefits across faculties (Compton & Allen, 2018). SRS are BYOD-friendly, cloud-based tools that have supplanted the varied (often unreliable and troublesome) ‘clicker’ systems invested in by many HE institutions in the past. SRS enable students to respond synchronously and asynchronously to polls, questions and other seminar/lecture content, and can boost engagement levels, improve outcomes and change the paths of interaction (Compton & Allen, 2018).
Ultimately, we found that the tool that held the most potential, was generally the easiest to learn and use, looked professional, worked well with student devices and had the widest range of options (as well as satisfying data storage and other ‘back end’ technical concerns) was Mentimeter (https://www.mentimeter.com/). The same conclusion has been reached by a number of colleagues at other institutions, including Chris Little at Keele (2016), who describes the system as ‘a class above clickers’. He emphasises the benefits for staff: Mentimeter “offers highly-customisable activities which can facilitate an instant analysis of responses, provide downloadable data sets and create an interactive teaching and learning experience for groups of varying sizes” (ibid., p.3).
We now have a university-wide licence for Mentimeter with single sign-on (SSO), but found that over 130 university staff already had access to free accounts as a consequence of the trials, promotion and use by the EDU and word-of-mouth recommendation within departments and faculties.
In some departments there has been real traction. One team in the Faculty of Education and Health has been using Mentimeter consistently during the last academic year. Some of the lecturers are using it to give (anonymous) voice to students ahead of discussion of difficult or controversial topics. Others are using it to provide cognitive breaks and to allow quick (but, again, anonymous) checks of how far the whole body of students has grasped key concepts (not just the enthusiastic students nodding happily in row one!).
In the Faculty of Business it is used considerably in the Personal and Professional Development (PPD) elements of courses by lecturers who have designed weekly activities built around the different question types Mentimeter offers. For example, they might ask students to identify the top three job skills they think they need; the results appear as a word cloud, and the content is then visible, owned and made the subject of debate. Alternatively, students are invited to comment on attitudes to products or ideas using the matrix question format and can then see how their own sense of positionality compares with their colleagues’. Rationalising such things in subsequent debate enables the lecturer to draw out higher-order thinking from her students.
In one healthcare department the quiz function is used frequently as a formative assessment tool to consolidate prior learning and/or as an energy-raising technique during lull periods. In Engineering it is being used as an asynchronous tool for gathering ideas and opinions on topics ahead of a lecture so that the results can be discussed in class. I have also spoken to colleagues in partner institutions overseas who are using the tool to challenge student assumptions about what a university education in their context should be like.
Mentimeter can be set to work synchronously in-session, or asynchronously so that a live link can be shared via the VLE. It also offers easy-to-use embed codes, so questions or polls can appear within a VLE page, adding dynamism, interactivity and interest to what are often otherwise static pages.
We have yet to fully exploit the opportunities the site licence offers for encouraging student use and for sharing resources, as well as welcome new features such as PowerPoint upload. Some have already expressed concerns that, with a site licence, students will tire of these opportunities for interaction. I say: bring on the day when students are demanding more of us in terms of technology and the interaction, assessment and feedback opportunities it offers. We have been bombarding them with PowerPoint for a long time, and if resistance to new ways of fostering interaction such as Mentimeter were genuinely about student boredom, then we would have abandoned VLEs and PowerPoint eons ago! I urge all my colleagues to try Mentimeter with their students; it really does fulfil my essential criteria of being easy to use and, most importantly, fit for pedagogic purpose.
Butler, D., & Sellbom, M. (2002). Barriers to adopting technology for teaching and learning. EDUCAUSE Quarterly, 2, 22-28.
Compton, M., & Allen, J. (2018). Student Response Systems: a rationale for their use and a comparison of some cloud based tools. Compass: Journal of Learning and Teaching, 11(1).
Compton, M., & Almpanis, T. (2018). One size doesn’t fit all: rethinking approaches to continuing professional development in technology enhanced learning. Compass: Journal of Learning and Teaching, 11(1).
Kinchin, I. M., & Winstone, N. E. (Eds.). (2017). Pedagogic frailty and resilience in the university. Rotterdam: Springer.
Little, C. (2016). Mentimeter Smartphone Student Response System: A class above clickers. Compass: Journal of Learning and Teaching, 9(13).
Oakman, H. (2016). What students really think about using a VLE. University Business. Available at: https://universitybusiness.co.uk/Article/what-students-really-think-about-using-a-vle [Accessed 2 September 2018].
Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty Perceptions about Teaching Online: Exploring the Literature Using the Technology Acceptance Model as an Organizing Framework. Online Learning, 21(1), 15-35.