5 companies using AI for creative stunts and competitive edge

A room full of marketers drum their fingers on the boardroom table, wondering how to make their next ad campaign a little different. How can you make your brand stand out, offering something delightfully unexpected, deliciously whimsical, or simply bizarre?

The answer, for many, is AI. While the technology’s reputation rests largely on automating employees’ workloads and easing menial tasks, many have found its creative offerings more appealing.

AI is fast becoming the go-to for an Alice in Wonderland-style makeover of your marketing. How else could brands hope to recreate the lost speeches of the world’s great leaders, or air a chatbot-scripted TV ad that makes little to no sense?

Here, we round up the AI stunts that have hit the headlines in the last year for being poetic, macabre, and downright silly.

1. Burger King ads capture attention using AI’s absurdity

AI algorithms can sometimes fail to capture nuance. While that lack of human sensitivity and awareness can be disastrous (*cough* Amazon), other brands have capitalised on the resulting ludicrousness.

Last year, Burger King released a series of ads written by a deep learning algorithm after it watched 1,000 hours of the brand’s previous commercials. The fast-food chain called the project ‘Agency of Robots’, which poked fun at the way AI often churns out nonsensical text. The slogans included “tastes like bird” and “the Whopper lives in a bun mansion just like you”, with the chicken sandwich described as “a bed of lettuce for you to sleep on”.
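Burger King hasn’t published the model behind ‘Agency of Robots’, and the sketch below is deliberately much cruder than a deep-learning system: a word-level Markov chain over an invented scrap of ad copy. It does, though, show the basic mechanic – learn which words tend to follow which in a corpus of old scripts, then sample new lines – and it produces exactly the sort of almost-sensible slogans the campaign parodied.

```python
import random
from collections import defaultdict

# Hypothetical stand-in corpus: transcripts of Burger King's real commercials
# are not public, so these lines are invented for illustration only.
corpus = (
    "the whopper is flame grilled to perfection. "
    "our chicken sandwich sits on a bed of crisp lettuce. "
    "the whopper lives in a bun with pickles and onions. "
    "taste the flame grilled beef in every single bite."
)

def build_chain(text, order=2):
    """Map each pair of consecutive words to the words seen following it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=12, seed=None):
    """Walk the chain from a random starting pair to produce a slogan."""
    rng = random.Random(seed)
    key = rng.choice(list(chain.keys()))
    out = list(key)
    while len(out) < length:
        followers = chain.get(tuple(out[-2:]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

chain = build_chain(corpus)
print(generate(chain, seed=42))
```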

Although there’s some debate online as to whether the ads were actually written by humans, the whimsy proved popular, with the ads garnering up to 150,000 views each on YouTube.

2. Huawei completes Schubert’s ‘unfinished symphony’

The final two movements of Schubert’s Symphony No.8 had been missing for 197 years – until Huawei completed the symphony with a combination of AI and human expertise.

The tech company used an AI model to analyse the timbre, pitch and metre of the symphony’s existing first and second movements. From there, the third and fourth movements were created with the help of composer Lucas Cantor, who arranged an orchestral score from the AI-generated melody.
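Huawei hasn’t said which model it used, but the analysis half of the process described above – profiling pitch movement and rhythmic patterns in the existing movements before anything new is generated – can be sketched in a few lines of Python. The melody fragment below is invented for illustration; a real system would work from the full score and model timbre as well.

```python
from collections import Counter

# Invented melody fragment as (MIDI note number, beats) pairs; the score data
# Huawei's model actually analysed has not been released.
melody = [(71, 1.0), (69, 0.5), (68, 0.5), (71, 1.0), (76, 2.0), (74, 1.0)]

notes = [pitch for pitch, _ in melody]
intervals = [b - a for a, b in zip(notes, notes[1:])]   # pitch movement in semitones
rhythm = Counter(beats for _, beats in melody)          # crude metre profile

print("interval histogram:", Counter(intervals))
print("rhythm profile:", rhythm)
```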

The publicity stunt was showcased by a 66-piece English session orchestra at Cadogan Hall in London – an event attended by celebrities such as Myleene Klass, Erin O’Connor, and Daisy Lowe. While the company’s stated motivation was to prove that AI has a place in modern culture, the piece also proved an engaging way of underlining Huawei’s status in the world of tech, pushing boundaries and highlighting the fun, positive side of machine learning.

3. Edit Agency reconstructs famous Christmas carols

‘If it ain’t broke, don’t fix it’ is not exactly AI’s motto. For a solid AI-stunt campaign, the best way to capture attention is to take something beloved and turn it on its head using fantastical tech. This was the thought process behind marketing agency Edit’s rewritten Christmas carols.

The agency took 1,550 lines from 60 famous Christmas carols and ran them through the IBM Watson Tone Analyzer to gauge their emotional tone. The results formed the basis of five new Christmas carols created purely through AI, with illustrations to match.

The AI identified words or phrases that were classified as one or more of the following tones: joy, confidence, fear, anger and sadness. Edit then took the top-scored lines from each emotion and crafted them into five new tunes.
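Edit hasn’t published its pipeline, but the two steps just described – score every line for each tone, then keep the top-scoring lines per emotion – are simple to sketch. The example below swaps the Watson Tone Analyzer for a hypothetical keyword lexicon so it runs on its own; only the selection logic mirrors the campaign.

```python
from collections import defaultdict

# Stand-in tone scorer: Edit used the IBM Watson Tone Analyzer; this tiny
# keyword lexicon is a made-up substitute so the sketch runs offline.
LEXICON = {
    "joy": {"merry", "joy", "glad", "rejoice"},
    "confidence": {"shall", "triumph", "glory"},
    "fear": {"dread", "tremble", "dark"},
    "anger": {"rage", "storm"},
    "sadness": {"weep", "lonely", "bleak"},
}

def score_line(line):
    """Return a {tone: score} dict for one carol line (crude keyword overlap)."""
    words = set(line.lower().replace(",", "").split())
    return {tone: len(words & vocab) / len(vocab) for tone, vocab in LEXICON.items()}

def top_lines_per_tone(lines, n=2):
    """Keep the n highest-scoring lines for each tone, mirroring Edit's approach."""
    scored = defaultdict(list)
    for line in lines:
        for tone, score in score_line(line).items():
            if score > 0:
                scored[tone].append((score, line))
    return {tone: [line for _, line in sorted(pairs, reverse=True)[:n]]
            for tone, pairs in scored.items()}

carol_lines = [
    "Joy to the world, the Lord is come",
    "In the bleak midwinter, frosty wind made moan",
    "Glad tidings we bring, rejoice and be merry",
]
print(top_lines_per_tone(carol_lines))
```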

While these carols read like a mash-up of old classics, they somewhat lose their charm once reduced to individual famous lines strung together. “We all know that Santa’s coming, he has a very shiny nose” isn’t quite as catchy as one would hope – but nonetheless, they couldn’t help but be noticed by baffled readers among the stack of traditional agency Christmas cards.

4. Lexus unveils the world’s first AI-scripted ad

Luxury vehicle brand Lexus claimed to have created the first filmed advert written entirely by AI last year, shot by an Oscar-winning director no less.

In keeping with the brand’s reputation for ‘pushing the boundaries’ of technology, the 60-second film, Driven by Intuition, was directed by Oscar winner Kevin Macdonald and built by technical partner Visual Voice in collaboration with The&Partnership, with support from the IBM Watson suite of AI tools and apps.

The AI was trained on data including 15 years of award-winning luxury adverts, and Lexus says the script draws on evidence of which emotions audiences respond to best. With the usual sweeping shots of a car driving through a beautiful landscape and sleek close-ups of wheels on tarmac, the ad doesn’t feel dramatically different from other car adverts. But it does have an added layer of Blade Runner-esque robotic drama, and, more importantly, would you seek out a car advert online without its AI shtick?

The ad was released across digital, social and cinema channels in Europe to herald the launch of Lexus’ all-new ES executive saloon this year, adding a layer of the theatrical to a new product launch.

5. JFK’s ‘lost’ speech created in his own voice by machine learning

Eight weeks, 116,777 sound units and 831 speeches: that was all it took for sound engineers to bring to life 2,590 words John F Kennedy never got to speak.

His ‘lost’ Dallas speech, due to be delivered on the day he was assassinated in 1963, was pieced together from 0.04-second sound units, or ‘phones’, of which there are around 40 to 45 in the English language. The challenge was to make these phones sound natural, as one sound often merges into another.
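CereProc hasn’t released its synthesis code, and real unit-selection systems do far more than this, but the mechanical problem described above – joining 0.04-second phone units so that one sound merges smoothly into the next – can be sketched as a crossfaded concatenation. The sample rate and sine-wave ‘phones’ below are assumptions made purely for illustration.

```python
import numpy as np

SAMPLE_RATE = 16_000            # assumed sample rate; CereProc's is not public
UNIT_SECONDS = 0.04             # the ~0.04 s 'phone' units described above
UNIT_SAMPLES = int(SAMPLE_RATE * UNIT_SECONDS)

def crossfade_concat(units, fade_samples=64):
    """Join phone-sized audio units, blending each seam with a linear
    crossfade so one sound merges smoothly into the next."""
    fade_in = np.linspace(0.0, 1.0, fade_samples)
    fade_out = fade_in[::-1]
    out = units[0].copy()
    for unit in units[1:]:
        out[-fade_samples:] = out[-fade_samples:] * fade_out + unit[:fade_samples] * fade_in
        out = np.concatenate([out, unit[fade_samples:]])
    return out

# Toy 'phones': three 40 ms sine bursts standing in for units cut from recordings.
units = [np.sin(2 * np.pi * f * np.arange(UNIT_SAMPLES) / SAMPLE_RATE)
         for f in (220.0, 440.0, 330.0)]
speech = crossfade_concat(units)
print(f"{len(speech) / SAMPLE_RATE:.3f} seconds of stitched audio")
```

The crossfade is the simplest way to hide the seam between units cut from different recordings; without it, each join produces an audible click.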

CereProc, a company specialising in text-to-speech technology, is used by brands to bring characters and products to life, as well as to allow people losing the power of speech to motor neurone disease or other conditions to keep their voice. The company previously recreated film critic Roger Ebert’s voice after he lost his speech to cancer.

The JFK Unsilenced Project was a greater challenge because of the old analogue recording equipment used. In the final 20-minute speech, fewer than half of the neighbouring audio units had actually sat next to each other in the original recordings. Once a database was built, a new computer system was employed to recognise and recreate Kennedy’s oratorical style by learning delivery patterns from his historic speeches.

The result is a testament to what machine learning can achieve, not just in recreating the past, but in helping brands build a point of differentiation. It also gives a voice back to those who have lost theirs through illness. While this project took two months, people who record their voice in time need only three to four hours of audio for the technology to work clearly, giving them the power to keep their personality and, similarly, be unsilenced.
