Matthew Robinson, Analytics Director, London
So I have a confession. I get a bit twitchy when innovation, creative technology and bold design are heralded as the go-to industry jump-leads in advertising and media. Working in a creative industry, that might sound like heresy. But as someone who works in analytics, and aspires to support both Creatives and creativity, I worry that data still doesn’t get called upon enough to guide decisions and to inform the creative process.
To my mind, Nielsen are spot on in believing that more research equals more questions equals more answers equals a better product. As such, getting excited about the latest new and shiny creative tech is all well and good, but not if it lessens people’s commitment to research, robust testing and data-driven design, all of which drive the ongoing optimisation of any solution.
To a degree, the industry-wide resistance to routine testing is understandable. There’s a perception that it’ll ramp up costs, hamper innovation, fuel conflict, and put Creatives on the defensive. Quite famously, Google’s Visual Design Lead quit a few years ago because he resented having to make decisions based on test results rather than his own opinion. He wrote at the time: “Data eventually becomes a crutch for every decision, paralysing the company and preventing it from making any daring design decisions.”
But one could argue that testing actually helps foster more creativity. It encourages more variations, more potential solutions, and welcomes the bold and radical ideas as much as the safe and sensible ones. The only rider is that the strongest results, not the firmest opinions, have to hold sway. As testing guru Craig Sullivan once reflected: “It doesn’t matter what I think.”
And that for me is the salient point. Designers and developers must always remember they aren’t creating stuff for themselves; they’re working things up for users and consumers. And if that means more creative decisions get to be based on lifting conversion or increasing revenue, that’s got to be a good thing, right? If it works for Google, it can’t be bad for the rest of us.
That said, it’s wrong to think we need to choose between numbers and creativity. A strict adherence to testing each and every creative choice would be as limiting as always relying on gut instinct, best practice, or aesthetics. Just as you have to moderate your artistic and technical solutions against what consumers actually need, hard numbers shouldn’t pull rank unconditionally, particularly if they risk undermining a long-term strategy, an overall user experience, or a brand’s credibility. Besides, it can of course be risky only going with what users say they need. The customer is not always right, and doesn’t always know what they need. As the line often attributed to Henry Ford goes, if he had asked people what they wanted, they would have said faster horses. With all that in mind, here are four ways to keep data and creativity pulling in the same direction:
1. Identify which business objectives a creative solution is intended to support, and ensure Creatives know them. What does the creative have to deliver in order for the business to achieve its stated aims? Steve Jobs was always adamant that the business of design is not so much concerned with how something looks as with how it works. First, though, you need a proper grasp of what “it works” actually amounts to.
2. Detail the end-to-end user journey and map out the specific actions people need to carry out for the identified business objectives to be met. User experience design is nothing new, yet I remain unconvinced that it is grounded often enough in systematic qualitative research and usability studies. Which is a shame. So much of what a creative tech solution needs to offer in order to work would be revealed if greater attention were paid to validating form and function up front. As Debra Dunn, a professor at the Stanford Institute of Design, notes: “It is more from engaging with users, watching what they do, understanding their pain points, that you get big leaps in design.”
3. Establish performance KPIs on the back of the agreed user experience design. And, importantly, focus most attention on metrics that are properly actionable: the ones that drive the others, the ones that belong to a conversion funnel, or the ones that can easily be made the subject of split testing.
4. Make testing (be that A/B or multivariate) a continuous process. It shouldn’t be an ad hoc exercise. The wonders it can do for a solution’s efficiency ought to make it a no-brainer. Here, digital can and should take a leaf out of the direct marketing playbook. Few working in DM would think to run a large campaign without first placing a significant portion of the prospects into a test control cell. Alongside always-on testing, it pays to constantly reassess visitor satisfaction and intent. So no encasing the agreed UX in concrete. A data-led creativity and design process is never done and dusted.
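To make the testing step a little more concrete, here is a minimal sketch of how an A/B split might be read. The two-proportion z-test shown is one common approach, not the only one, and it is purely illustrative: the figures, the function name and the 1.96 threshold (a 95% confidence level) are assumptions for the example, and real programmes also need to plan sample sizes, run tests for full business cycles, and guard against multiple comparisons.

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Evaluate an A/B split with a two-proportion z-test.

    conv_a / n_a : conversions and visitors in the control cell
    conv_b / n_b : conversions and visitors in the variant cell
    Returns (relative lift, z statistic, significant at z_crit).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference in proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    lift = (p_b - p_a) / p_a
    return lift, z, abs(z) >= z_crit

# Hypothetical example: control converts 200 of 10,000 visitors,
# the creative variant converts 260 of 10,000.
lift, z, sig = ab_test(200, 10_000, 260, 10_000)
print(f"lift={lift:.1%}  z={z:.2f}  significant={sig}")
```

The same logic extends to multivariate tests, though those usually call for a proper experimentation platform rather than a hand-rolled calculation.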
In summary, by all means seize hold of the exciting possibilities that creative technology can deliver. But, as much as possible, try to innovate in accordance with analytics best practice and data-led design. Let insights and proven results aid the business of being creative.