2023 looks to be another long, turbulent year. Disruption necessitates innovation, and there’s plenty of disruption to go around. Generative AI tools such as ChatGPT, Stable Diffusion and DALL-E are enabling content creation at unprecedented scale. Meanwhile, machine learning is fast replacing humans in optimizing campaigns.
Ironically, the speed and scale of automation emphasize how technology and marketing are more than a numbers game. The internet is so inundated with low-quality content that Google is reworking how sites are ranked with its Helpful Content Update, described as a “shakeup similar to Penguin’s launch 10 years ago”. Meanwhile, Facebook users’ satisfaction and retention improved after receiving fewer notifications.
There’s a palpable feeling that we’re at the end of an era. The migration to digital and social is done. The next wave of change is AI and machine learning, signalling a return to basics: empathy and understanding of the customer, relationship-building with opinion leaders, and creative storytelling that ties everything together.
How AI doesn’t work
Despite AI’s advances, the classic computing adage applies: Garbage In, Garbage Out. Data is great for understanding known quantities. Dashboards, for instance, provide insight into historical performance, while generative AI is trained on bodies of existing work. Conversely, without human intervention, these systems gloss over the unknowns.
Building a leading brand means starting trends and creating demand where none exists, which is the opposite of how current AI works. Some of my most impactful content projects had no engagement value at the time they were created. Take, for instance, asking digital marketers to use decades-old Marketing Mix Modelling in the 2020s: when that content was first published, the idea was practically unheard of. Today, it’s part of the wider conversation, including an in-depth presentation by Les Binet.
Getting buy-in for that involved very human conversations: our audience isn’t thinking about this, what we measure can’t capture it well, but we need to steer the conversation to drive long-term demand. More 1970s Theory of Reasoned Action than 2010s Digital Age Hypodermic Needle. Data did eventually come in useful to refine the delivery, but only after the humans involved could agree on a very counter-intuitive direction.
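For readers unfamiliar with the technique, here is a minimal sketch of what a Marketing Mix Model can look like: media spend is transformed with an adstock carry-over effect, then regressed against sales to estimate each channel’s contribution. The geometric adstock formulation, the column names and the toy numbers are all illustrative assumptions, not anything from the project described above.

```python
# Minimal Marketing Mix Modelling sketch (illustrative only):
# regress sales on adstocked media spend to estimate channel contributions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def adstock(spend: np.ndarray, decay: float = 0.5) -> np.ndarray:
    """Geometric adstock: each week carries over a share of last week's effect."""
    out = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for i, s in enumerate(spend):
        carry = s + decay * carry
        out[i] = carry
    return out

# Hypothetical weekly data: media spend per channel and sales.
df = pd.DataFrame({
    "tv_spend":     [100, 120, 90, 150, 130, 110],
    "search_spend": [40, 45, 50, 42, 48, 55],
    "sales":        [900, 1010, 950, 1100, 1080, 1020],
})

# Transform spend, then fit a simple linear model of sales.
X = np.column_stack([
    adstock(df["tv_spend"].to_numpy(), decay=0.6),
    adstock(df["search_spend"].to_numpy(), decay=0.2),
])
model = LinearRegression().fit(X, df["sales"])

print("Baseline sales:", round(model.intercept_, 1))
print("Channel coefficients (TV, search):", model.coef_.round(2))
```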
Certainly, it’s not what ChatGPT would have recommended. Prompts will typically generate a moderate answer that hedges between known schools of thought, then concludes with no strong opinion. That’s by design. A human needs to commit to a decision; robots can’t do that on our behalf. Making these decisions means looking at areas AI cannot cover yet.
The changing human role
While humans are far from obsolete, job descriptions might be. To start, there’s the creative process. The hardest part about creating content is not the execution; it’s the research and planning. Defining a specific audience, understanding their culture, adjusting for competitors who are also using AI, then navigating the tightrope: tip to one side and appear inauthentic and unoriginal; fall to the other and spark controversy or irrelevance. Anyone can write words and push pixels. A creative’s job is to provide the right ones for an increasingly complicated context.
Then there are the technical details. What’s a GAN, an LLM, a training set? How does the algorithm bring it all together, and what levers can be pulled to derive more originality and insight? This will culminate in deep integration across generative AI, asset management and first-party data. Once again, humans are needed to map out their organization’s unique needs, then engineer systems to collect and associate relevant data. There needs to be a clear journey from AI prompt to derivative content to measurement models and forward iteration.
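To make that journey concrete, here is a minimal sketch of one loop through such a pipeline. Every function, field and campaign name below is a hypothetical placeholder for an organization’s own generative AI, asset management and analytics systems, not a real vendor API.

```python
# Sketch of the prompt -> derivative content -> measurement -> iteration loop.
# All names here are hypothetical placeholders, not real vendor APIs.
from dataclasses import dataclass, field

@dataclass
class Asset:
    prompt: str                                   # brief/prompt that produced the draft
    content: str                                  # generated (then human-edited) content
    tags: dict = field(default_factory=dict)      # audience, campaign, channel
    metrics: dict = field(default_factory=dict)   # engagement, conversions

def generate_draft(prompt: str) -> str:
    """Placeholder for a call to a generative AI model."""
    return f"[draft generated from: {prompt}]"

def store_in_dam(asset: Asset) -> None:
    """Placeholder for the digital asset management system."""
    print("Stored asset with tags:", asset.tags)

def measure(asset: Asset, first_party_data: dict) -> dict:
    """Placeholder: join asset tags against first-party performance data."""
    return first_party_data.get(asset.tags.get("campaign", ""), {})

# One iteration of the loop.
brief = "Explain Marketing Mix Modelling to digital-first marketers"
asset = Asset(
    prompt=brief,
    content=generate_draft(brief),
    tags={"campaign": "mmm_explainer", "audience": "digital_marketers"},
)
store_in_dam(asset)
asset.metrics = measure(asset, {"mmm_explainer": {"reads": 1200, "leads": 35}})
print("Measured:", asset.metrics)
# The measured results then inform the next prompt (forward iteration).
```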
This will lead to increasingly fluid job descriptions. The classic Marketing and Communications background spanning sociology, behavioural science and design theory is still essential, especially now that the end of cheap money is correcting Marketing’s 7Ps away from their skew towards promotion. At the same time, Computer Science backgrounds will bring much-needed skills for AI adoption.
So far, they’ve been separate worlds. Marketing and data teams tend to struggle with showing ROI, whether because of unrealistic expectations, misguided metrics or tenuous causality. There’s a clear need for polymaths at the intersection of both. Someone needs to know both language and large language models, both text-to-image generation and art direction, both strategy and measurement modelling. If not, separate teams will have trouble setting expectations and drawing a through line. For now, this role is so unheard of that I don’t know whether to call it hybrid, generalist or hyper-specialist. Such roles will need to be introduced to organizations through investments in cross-skilling.
The world doesn’t wait
Futurist Rita J. King has dubbed this era the Imagination Age, while Peter Diamandis predicted the Fortune 500 may see mass extinction. AI is not the only change driver: fossil fuels are making way for renewables, and populations are aging. The need for human connection will stay the same, but the methods are changing, as they always have. It’s a good time to ask: how are you preparing for the future?
(This article by Sean Cheo first appeared on www.medium.com)