Mentimeter was founded on a belief in the capacity for interaction and engagement to harness the power of together, unlocking the potential of presenters and audiences in a way that traditional presentation tools simply cannot. But this belief wasn’t based on much more than a gut feeling… until now.
This project began with the idea that allowing an audience to interact with a presentation in real time would have a positive effect on the audience experience as a whole. The thinking was that facilitating interaction through our presentation platform would lead to reduced boredom, increased engagement, increased interest, and an increased capacity to learn. This "Mentimeter Effect" is something we experience internally and hear our users refer to all the time, but we had no real way of quantifying it.
So we wanted to know: was this just a hunch? Was it our own internal bias telling us this was the case? Or is there really no difference between us and traditional presentation tools? We were curious, so we decided to try to find out.
We needed someone who could independently conduct empirical research into how Mentimeter performed compared to other presentation tools. So we reached out to Emotiv. Emotiv is a bioinformatics company which specializes in carrying out brain studies using special devices called electroencephalographs, or EEGs.
Emotiv uses EEGs to track cognitive performance and to monitor emotional responses in participants, which was exactly what we needed to find out how people respond when they use Mentimeter. Experts, like the people at Emotiv, can then read this data to tell us what is going on in the participants’ heads.
In the first fully remote study of its kind, we set out to quantify the level of participant engagement when they experience a Mentimeter presentation. We also wanted to better understand how this engagement affects emotional responses such as boredom or interest, and the cognitive effects on the participants’ capacity to absorb information. We set out to answer the following questions:
Electroencephalography (EEG) measures the electrophysiological activity of the brain's cortex (the top surface) from scalp electrodes placed on the head, to produce a set of signals. These signals are presented in raw digital data streams as waves. Brainwaves, like sound or light, can differ in frequency and amplitude. The frequencies are often split into different groups (from slowest to fastest) to understand which waves are more dominant at a certain time.
Emotiv then mathematically transformed these waves to tell a meaningful story about what was happening in the participants’ brains. Further, these signals can be quantified with algorithms that track complex patterns of affective and cognitive states.
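To give a feel for what "splitting frequencies into groups" means in practice, here is a minimal, hypothetical sketch of estimating how much power a raw signal carries in each classic EEG band. This is purely illustrative and is not Emotiv's actual pipeline; the sampling rate, band boundaries, and synthetic signal are all assumptions for the example.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz
BANDS = {  # classic EEG bands, slowest to fastest (Hz); boundaries vary by convention
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def band_powers(signal, fs=FS):
    """Return each band's share of the signal's total spectral power."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)      # frequency of each FFT bin
    power = np.abs(np.fft.rfft(signal)) ** 2            # power spectrum
    total = power.sum()
    return {name: power[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic one-second "recording": a dominant 10 Hz (alpha-range) wave plus noise
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)

shares = band_powers(signal)
dominant = max(shares, key=shares.get)
print(dominant)  # the 10 Hz component makes alpha the dominant band here
```

Real analyses are considerably more involved (artifact removal, windowed spectral estimates, per-electrode processing), but the core idea of comparing dominance across frequency bands is the same.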
So we knew the questions that we wanted to answer, but what should we measure to try to do this? Working together with Emotiv and using the EEG devices, we were able to measure five relevant brain activities:
To measure these five cognitive activities, Emotiv recruited 28 participants from 15 different countries across 9 professions to help us to find the answers we needed.
So having assembled the participants and with the parameters of the study outlined, Emotiv were ready to begin independently conducting the research.
The participants remotely watched two live presentations: one built in Mentimeter and the other in PowerPoint. The two presentation topics were “AI in Music” and “Harmonic Series”. Participants logged in to the call, set up their headsets, and had their signal quality checked. They completed a baseline recording and a short questionnaire before watching the first presentation, and took a short break before the second. Participants were split into five groups in order to counterbalance delivery order and control for order effects.
We had hoped that the results of the research would help to reinforce our hunch, at least a little bit. But in the end, what we got back from Emotiv exceeded our expectations!
The headline results were these: the data showed that, across the experiments, when we compared the participants' experiences of Mentimeter vs. PowerPoint, for Mentimeter:
This finding was demonstrated in the Mean Performance Metric, which showed that across all the metrics we were measuring, Mentimeter performed better than PowerPoint. Beyond the headline mean results, there were also a number of interesting findings in the finer details.
We can see a clear distinction between levels of boredom, engagement, attention, interest, and cognitive load in Mentimeter’s interactive events vs. PowerPoint’s standard slides, with Mentimeter events outperforming PowerPoint in every category.
When we look specifically at the data relating to boredom and engagement in the two experiments, we can clearly see the importance of an interactive icebreaker in decreasing boredom and driving engagement to a peak. In the “AI in Music” experiment in particular, Mentimeter outperformed PowerPoint for the entire duration of the presentation.
We could clearly see the benefit of engaging your audience early in order to ensure you have their attention later on in the presentation. Those presentations that grabbed the attention of participants with an interactive icebreaker showed greater attention later on.
One finding we did not anticipate but were happy to receive was the increased potential for Mentimeter presentations to unify the audience in the room compared to PowerPoint. In the graph above, a higher and narrower peak signifies greater unanimity of response, and we can see a clear difference between the two presentation platforms.
Particularly exciting for us was the finding that interactive slide types showed such a remarkable difference from non-interactive slide types in the effects we measured, with non-interactive slides showing much higher levels of boredom and much lower levels of engagement and attention compared to the interactive slides.
Specifically, these findings showed the benefit of opinion slides. These allowed participants to give explicit voice to their thoughts and opinions, and this elicited the strongest response for every metric we were measuring. This empirically demonstrated the original idea behind Mentimeter: that interaction and audience voice make presentations more engaging, better at holding attention, and less boring!
We believed internally in the efficacy of our product, but we wanted to be sure, rather than basing our claims on earnest belief and personal experience alone. We weren’t sure what Emotiv would tell us when they came back with the results. We hoped that they would help to confirm our belief about the value Mentimeter gives to an audience.
In the end, the results came back and showed that Mentimeter performed beyond even our own expectations. We were proud to see the experiments show Menti excelling in every metric we measured. As we had hoped, Mentimeter was shown to reduce boredom and increase attention, engagement, and interest; but the experiments also demonstrated Mentimeter’s ability to increase audience members’ capacity to hold and retain information during and after a presentation.
In highlighting the benefit of using opinion slide types to invite input from participants, the experiments demonstrated the importance of letting people have their voices heard when engaging with an audience. They showed conclusively that modern leadership is about listening in order to be heard.