Supercharge creative effectiveness: avoid underperforming creative through new and better ad testing techniques
Why doesn’t every campaign win a Cannes Lion for Creative Effectiveness? And how long can brands keep rolling with low advertising ROI by topping up media budgets?
The industry debate rumbles on around the long and short of creative effectiveness. Performance marketers collect reams of data on channel and format. We can certainly tell WHICH ads perform best, but if we want better results and greater effectiveness for our brands, we have to know WHY certain creatives work while others just look good.
If we unpack the creative process, it typically starts out with good science. Consumer or B2B insights – what matters to the audience, which value propositions stand out, what emotional drivers are in play – the what, the which and the how.
As the insights debrief reaches planning, the starter thinking takes shape and becomes the creative brief. By the time the creatives get rolling, the starting point is already a little way off from the original insights (more on this later) – and once the creatives have done their thing, the brand manager may well be looking at a number of different creative strategies, all of them fulfilling the brief, and any of them potentially delivering the intended outcome.
Picking your best-performing creative – the one that drives action
But which way should we go creatively? Will it work? Can we drive action? How do we know where to fine-tune or make improvements? This is the holy grail for brand marketers and budget holders. The principle of ad testing has been around for some time – and its use is growing as the stakes for creative impact get higher.
Common sense says, ‘ask the audience’, to find out which creative we should run. It works, but only to a point. Explicit testing, such as qualitative reviews or focus groups, helps to explore conscious effects – content, distinctiveness, brand recall, and take-away messaging. But this traditional approach to explicit insight – where we literally ask the question and receive the answer – captures only what people SAY they like or dislike about your ad; it doesn’t get close to how they FEEL, because it can’t reach what is happening subconsciously in the brains of our target audiences.
In fact, focus group ad testing alone can send marketers the wrong signals about which creative works best. It’s an important cautionary note: even where due diligence is in place to test and optimise ads, the insight being acted upon must reveal the true intent of the audience. Without a deeper emotional perspective in the testing, we risk picking the ad that scores highest creatively but, performance-wise, fails to drive the intended behaviour or action.
Reading faces & unlocking consumer minds for more effective comms
The connection a brand builds with its consumers is emotional. It goes beyond our rational, conscious processes, and it plays a crucial role in driving our behaviour. Great creative delivers on an emotional level. When we truly believe, we form subconscious associations – beliefs that sit beneath our conscious thoughts – and we are far more likely to act on those beliefs. And that shows up in better campaign KPIs.
Creative testing, therefore, also needs to establish the connection on deeper, implicit levels – learning about attention, emotion, motivation, meaning and impact in driving action.
Through our online panel, we’ve put together a far more effective way of testing ads, to find clearer truths about performance. We track facial expressions, measuring emotion moment by moment as the creative is viewed. It’s neuroscience right at the heart of emotional response, presented back as peaks and troughs so we can refine, build and optimise the creative before (preferably) or after it hits the market.
Applied neuroscience can unlock what actually matters to your audience, pinpointing the elusive WHY of your creative hitting the spot or missing the mark. We use Reaction Time testing to go far beyond what people are able to articulate.
We use it to understand their level of conviction – when they say yes, we can interpret the strength of the ‘yes’ they really experienced, and therefore the implicit power of the creative to trigger the desired behaviour for the brand. Hesitation definitely matters (more on this here) – by measuring how long we pause before we hit ‘like’, marketers can get to the real insight they need to tweak ad creative and comms messaging, bringing them closer to the immediate, innate response that will dial up how the whole campaign performs. We wrap in the explicit responses, bringing together all three techniques for a much more comprehensive way to test ad creatives and predict performance in the wild.
We get it – these campaigns have to roll. You just need to plan in one week for our Creative Connection ad testing package – we’ve made all of this science available through online testing. As soon as we receive the creative, we need just one week to test and share back the recommendations – whether you’re looking at adcepts, creative routes or finished ads, we can help fine-tune your approach with a short turnaround. And what are a few days of extra testing, compared to the lifetime performance of your campaign? If you want to win a Lion, push new creative boundaries, or better forecast performance and sales impact – we know it’s not easy, but with the right tools, we can get science and creative working hand in hand for your brand to find your creative edge.
If you’d like to learn more about Creative Connection, how it works and our client brands who use it, drop us a line here and we’ll send you further details.
This article was written by Simon Collister, Planning Director, Human Understanding Lab: firstname.lastname@example.org.