“You have to learn the rules of the game. And then you have to play better than anyone else.” – Albert Einstein
Was your event a success? Did your audience go away having learned something of value? Are they likely to change behaviour as a result of the presentations?
And how do you know if your audience was engaged?
The key performance indicators we measure will depend on the type of event and what the organisers are looking for. For physical events, it might be: how many registered audience members attended? How did they respond to sessions and the overall experience?
One of Open Audience’s differentiators is that we can also give organisers insights they might not otherwise have.
For example, if organisers have specific goals in mind, we can determine which questions weren’t answered.
This is something we have long done in the physical world, and we have successfully adapted our approach to gathering KPIs for the virtual world, providing information many organisers thought was only possible in the physical space: who attended, how often they answered polls, and even how engaged the audience was.
We can help drive discussions and ideas generation on specific topics, which means that you can be more robust in terms of thinking about what you’re trying to measure. There are the obvious layers, such as, how interesting was a presentation? How well presented was it? Which speakers made an impact, and which didn’t?
But the next layer is really where the value of your event lies, which is where you ask the audience: Did you learn anything new? Did you feel it was an unbiased and scientifically valuable discussion? And are you going to do anything new as a consequence of being in this meeting (such as change the way you review patients, ask different questions or do more research)?
What we have found in our analysis is that an audience might find the sessions very interesting and relevant yet have no intention of changing their practice: they are happy with the way things are, have evidence their current approach is working, or want more robust evidence before they consider a change of approach.
These types of KPIs can be measured in both the physical and virtual world, but we can actually be more robust in the virtual world because we can continuously measure feedback through polls and other forms of audience interaction. For example, in a session designed for nurses we measured what they knew before the presentation vs. what they knew and thought at the end, and identified any gaps to be followed up.
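The pre- vs post-session comparison described above can be sketched in a few lines. The question labels, scores, and the 80% threshold below are illustrative assumptions, not figures from an actual session:

```python
# Minimal sketch of comparing what an audience knew before a session
# vs. at the end, and flagging topics that need follow-up.
# Question names and correct-answer rates are invented for illustration.

def improvement(pre, post):
    """Per-question change in the share of attendees answering correctly."""
    return {q: round(post[q] - pre[q], 2) for q in pre}

def knowledge_gaps(post, threshold=0.8):
    """Questions where the post-session correct rate is still below threshold."""
    return [q for q in post if post[q] < threshold]

pre  = {"dosing": 0.45, "side_effects": 0.60, "monitoring": 0.70}
post = {"dosing": 0.90, "side_effects": 0.72, "monitoring": 0.95}

print(improvement(pre, post))   # {'dosing': 0.45, 'side_effects': 0.12, 'monitoring': 0.25}
print(knowledge_gaps(post))     # ['side_effects'] -- a topic to follow up on
```

The same two numbers per question are enough to show both the learning effect of the session and the gaps that remain.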
This is particularly relevant when it comes to continuing medical education and certification for CME credit. I equate this to tracking users through a learning management system, which pharmaceutical companies use to ensure their sales forces understand compliance requirements.
When I worked in the industry, it wasn’t uncommon for everyone to pass online questionnaires in full, only for a challenge to appear in the real world two weeks later that ran counter to their success in online learning.
We realised that with learning management systems, team members would often go to the evaluation form at the end of each module and try to complete it without reading or watching the module.
If they didn’t pass, they would try again to complete the test without doing the module. If they failed a second time, they might simply speed through it.
So we realised we should track how long users spent on each module. What was the average time spent viewing a module? If it’s less than a minute on a 15-minute module, perhaps they are demonstrating they can already answer the questions, but they might also be skipping content without understanding it and missing out on crucial knowledge.
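That time-on-module check is simple to express in code. This is a minimal sketch, assuming the LMS can export time spent per completion; the field names and the under-10%-of-module-length rule are illustrative assumptions:

```python
# Sketch of flagging module completions where the viewing time is
# suspiciously short relative to the module's length.
# Record fields (user, module_minutes, time_spent_minutes, passed)
# are invented for illustration, not from any specific LMS.

def flag_skimmers(completions, min_fraction=0.1):
    """Return users who passed but spent under min_fraction of the module length."""
    flagged = []
    for c in completions:
        if c["passed"] and c["time_spent_minutes"] < c["module_minutes"] * min_fraction:
            flagged.append(c["user"])
    return flagged

completions = [
    {"user": "a", "module_minutes": 15, "time_spent_minutes": 0.8, "passed": True},
    {"user": "b", "module_minutes": 15, "time_spent_minutes": 14.0, "passed": True},
]
print(flag_skimmers(completions))  # ['a']
```

A fractional threshold rather than a fixed number of minutes keeps the rule sensible across modules of different lengths.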
If we transfer that to virtual CME sessions, and you are an organiser wanting to know whether your attendees have actually been viewing and engaging throughout, this is often difficult if they don’t have their cameras on. One simple approach is to randomise ten polls during the day and link accreditation to successful results from those polls. If attendees don’t pass, or don’t answer all the polls, they may not be fully engaged.
When it comes to the sessions themselves, the way you measure the value of what you are delivering and improve your audience engagement is by getting your attendees involved in the conversation.
Polls are an effective method, but so too is the quality of the presentation and the conversations that follow. Panel discussions and facilitated Q&A are very important.
Does this resonate with your thinking? Want to hear more? We can help you measure your events more effectively and better engage your future audiences.
Leslie Robertson is the Founder of Open Audience, an audience engagement consultancy that specialises in making life sciences meetings more engaging with more positive, successful outcomes – whether in-person or in the virtual space. The Open Audience team helps to strategise and prepare pre- and post-meeting as well as providing real-time support and guidance during the meeting. Open Audience also offers customisable, multilingual engagement platforms that include interactive polling, surveys, and ideas exchange.