In the red corner, Big Data! Trained by Google, Facebook, Amazon and Microsoft – current holder of the World Profiteering Title belt with $1 trillion wins and no losses!
In the blue corner, Ethical Business! Trained by Your Nervous Conscience – a plucky challenger with a couple of wins and too many losses. Ding-ding, Round 1!
Big Data: “extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions.”
Ethical Business: “business demonstrating respect for key moral principles that include honesty, fairness, equality, dignity, diversity and individual rights.”
We’re fascinated by data. It enables the shift from instinct-driven storytelling to insight-driven storytelling. The danger is that we’re having so much fun that we get dragged along in the wake of whatever project we’re pursuing.
To paraphrase Jeff Goldblum’s character in Jurassic Park, we’re often so preoccupied with seeing if we can do these things, we don’t stop to think if we should.
Whether it’s the fact that Google are building cities, the increasing focus on blockchain technology, or the terrifying implications of China rating its citizens based on their internet habits (with technology that has been sold to and deployed in an Australian state which exclusively holds Aboriginal children in its youth prisons), the conversation around Big Data is shifting. Many have been justifiably suspicious of huge centralised data sets since the early days of the internet, and many more since the rise of MySpace and the socially networked society. Once upon a time, people avoided revealing their true identity on social media: “I wouldn’t give my data to corporations.”
The decision is now out of our hands – the corporations took it without our permission. Contextual data pins you down, and your data isn’t yours. From online trends influencing U.S. election results in 2016 to Uber’s driverless car projects, data is bearing fruit in very profitable ways.
We recently completed a contract for a large multinational with impeccable ethical values. A key aspect of the campaign was creating a three-way link in the minds of consumers between:
- a major supermarket
- the most popular ethical products available
- a company who certifies those products for an ethical supply chain
The campaign was multifaceted, fun and creatively rewarding (the money was okay too), but the part I want to talk about is Mobile Geofencing. In the most basic terms, this involves three steps:
- target consumers (by demographic, interests and habits)
- encourage them to do what you want (e.g. buy something)
- create a virtual fence around the place you want to capture them (while timing the campaign around when they’ll likely be there)
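The “virtual fence” in the last step is, at its core, just a distance check: is the user’s reported location within some radius of a point of interest? Here’s a minimal sketch of that check in Python – the coordinates and the `in_geofence` helper are hypothetical, purely to illustrate the mechanics, not any ad platform’s actual API.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(user_lat, user_lon, fence_lat, fence_lon, radius_m=1000):
    """True if the user's reported location falls inside the virtual fence."""
    return haversine_m(user_lat, user_lon, fence_lat, fence_lon) <= radius_m

# Hypothetical coordinates: a bottle shop and an office roughly 500 m apart.
shop = (-37.8136, 144.9631)
office = (-37.8105, 144.9590)
```

In practice the ad platform does this server-side against device location signals, layered with the demographic and timing targeting described above – but the geometry is this simple.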
It’s a relatively new technology and not a lot of businesses are across it, never mind in the ethical sphere. We decided to leverage it for this campaign, and we achieved excellent results.
Here’s an example of how it works.
Scenario: Craft Beer Barry
It’s evening, work has just finished and Barry’s browsing his phone as he leaves the office. We’ve set up a contextually targeted ad (towards “craft beer drinkers with disposable income”), geofenced within a 1km radius around a local boutique bottle shop.
Barry’s office is only 500m away, so he spots our ad for a new limited edition fresh-hopped IPA. He’s a bit thirsty after a long day and it’s been a tough week. He’s having some mates over after work tomorrow, so why not get some drinks in tonight and save himself a trip to the bottle-o tomorrow? He loves the brewery because they make good beer, plus it’s organic, carbon neutral and local. Commonly available third-party data lets us know that his fortnightly pay has just landed.
He drops into the bottle shop and picks up a couple of sixpacks, two bottles of wine and a nice whiskey from Tassie on special – he’s feeling good and doesn’t think twice about dropping half a day’s wages on it.
We don’t apply this technology to alcohol sales. We could, but we don’t. Others do. Plus gambling, ciggies, sex and other behaviours that people find themselves addicted to.
Who doesn’t like playing with a virtual crystal ball? It makes you feel like a wizard. For nerds like us, data is fun. It’s also totally creepy and has vast potential for unethical behaviour. In the case of our real-life client, our campaign was a soft sell – we accessed consumers via mobile devices when they were most amenable to our message (some might say vulnerable), at the point when they were closest to our target supermarkets, with creative assets which encouraged them to go and make an ethical spend.
That’s okay, isn’t it? We’re the goodies, so it must be, right?
The campaign performed excellently – we achieved superb ROI and increased sales significantly. We measured how many people saw the ad, how many people clicked it, how many of them visited the stores, and – by collating our data with the supermarket’s – a solid estimate of how many purchased a product. Great data, solid insights.
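That measurement chain is a classic funnel, and the maths is trivial once the data is collated. The sketch below uses made-up figures (not the client’s actual numbers) just to show how each stage’s rate is derived from the one before it:

```python
# Hypothetical campaign funnel -- illustrative figures only,
# not the real campaign's results.
impressions = 200_000          # people who saw the ad
clicks = 3_000                 # people who clicked it
store_visits = 900             # clickers later seen inside the geofenced stores
estimated_purchases = 400      # estimate from collating our data with the supermarket's

ctr = clicks / impressions                       # click-through rate
visit_rate = store_visits / clicks               # clicks that became store visits
conversion = estimated_purchases / store_visits  # visits that became purchases
```

Each ratio tells you where the funnel leaks, which is exactly what makes retargeting the people already partway down it so attractive.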
And that’s just the beginning – we mined data from over 10K individuals, and we’ll retarget them, with a way higher CTR. We’ll put cookies on all their devices, learning more about their habits every day with our little pixels.
Not so bad, right? Sure, it’s maybe a little creepy, but it’s legal, and we’re the good guys. It’s for the Greater Good. But is it right? We’re encouraging consumers to choose an ethical alternative over their usual non-ethical purchases. So far so good. But where do we draw the line?
- Would you use data to sell ethical products? Sure!
- Would you use data to sell other products? Maybe…
- Would you use data to sell ideas? …that’s the world we’re in now.
So how do we decide whether we have permission to manipulate data to influence people’s decision-making?
I’d love to hear what your limits are. I’m still figuring mine out.