This article originally appeared as a comment piece in The Drum.
The early 21st century has been characterised by an unprecedented erosion of trust in society's traditional institutions, so it has never been more important for advertisers to break out of their echo chamber, argues Harriet Kingaby, head of content and insight at BoraCo.
Digital technologies are disrupting the world around us, shaping societies, worldviews and access to information. But they are developing faster than the codes we wrote to govern them. Planning for the future starts now: never have ethics been more business critical.
The events of the last 18 months have created seismic social shocks and sent trust in business, politicians and the media into freefall. Brexit has divided the UK, while across the Atlantic, Trump’s particular brand of populism has highlighted the contrasting political, social and racial boundaries of the USA.
These schisms breed a cynicism intensified by social media echo chambers, as cries of ‘fake news’ and perceived editorial bias fuel a deep distrust in ‘the establishment’. Brands are identified as part of ‘the establishment’, and seen as remote and self-serving: a recent study by Trinity Mirror found that almost 70% of consumers don’t trust advertising and 42% distrust ‘brands’.
Not just particular types of brands either: everyone is being lumped in with everyone else, and trust in business is now inextricably linked to what’s happening in politics, the media and elsewhere.
This erosion of trust is hard to rebuild, particularly when relationships with brands are conducted across so many digital touchpoints. We interact with brands on our phones, televisions, laptops and social media, a variety of contexts that require very different strategies. The Trinity Mirror survey also found that people want to see ‘real world proof’ that brands live up to the promises they make. They must embody their espoused purpose and values, appearing consistent across these touchpoints: from push notification, to billboard, to hiring policy.
In its simplest form, this means that brands and their agencies must know where their ads are, not just what’s in them. It means they must use formats that aren’t intrusive, work hard to be consistent, and be transparent about the data they’re harvesting. But it means much more than just good marketing: it requires them to be resilient and resolute, able to make company-wide decisions that consider the next 10 years, not just the next reporting cycle. And this is vital, because brands, the same ones currently struggling to retain trust, will be the stewards of one of the biggest changes in human history: the widespread introduction of AI.
Because this rate of change isn’t going to slow down. We are, in 2017, teetering on the brink of a fourth industrial revolution that will be AI-driven. Machine learning will introduce systems that learn for themselves, meaning that the rate of technological advancement could skyrocket. In this rapidly advancing world, decision making will be key.
Seemingly simple choices, such as the industry funding of policing against ad fraud, could set undesirable precedents for the future (such as justice only for those who can pay). Arguably, we are already seeing the fallout from the decision by social networks to use marketing algorithms to serve us news. The car industry, designing the next generation of driverless cars, is already asking itself such complex ethical questions as who its cars would protect or harm in an accident.
The questions that marketers, tech brands and governments are starting to ask themselves are not just ‘can we?’ but also ‘should we?’ and ‘how should we?’
The decisions we make today are vital: they will dictate the values, and so the reality, of the future we create. Never before have ethics been so business critical.
That’s why we need to start thinking differently right now, bringing those thinking 10 years into the future together with those thinking about their next quarterly report. If we act swiftly, we can shape the impact of technology on our future, rather than be shaped by it.
We need a network that facilitates knowledge transfer and unites action around the problems of tomorrow. Collaboration around digital ethics will allow us to share knowledge and build frameworks for managing these impacts, whilst innovating for the future. Together, we can avoid falling victim to over-enthusiastic legislation, misinformation and unprecedented risks. Ultimately, it will mean we can proactively create a technology-driven world we would want to live in. The time to act is now.