Back in the good ol’ days, climate disinformation came from high-carbon companies with vested interests. Flowers spewed forth from chimneys on billboards, and coal was clean. In today’s weaponised, AI-driven information environments, that disinformation has become even more sophisticated.
Right now, the web has a single reigning business model: digital advertising, an industry worth over $300 billion annually.
This business model has upsides, like “free” access to content, tools, and platforms. But there are also steep harms — and consumers bear the brunt of them. Today’s AI-powered targeted advertisements can spread misinformation and disinformation. They can reinforce societal biases and discrimination. They can increase surveillance. And they can even harm the environment.
It may be possible to reform digital advertising to mitigate these harms. But could mainstream alternatives be introduced instead?
A webinar for the 2020 Mozilla Festival with Nathalie Marechal of Ranking Digital Rights, Akintunde Sultan of DevCareers, and Janice Waite of Mozilla. Listen here.
… As someone who has worked in advertising, I was absolutely fine with the idea of receiving more tailored ads when I surfed the web.
But that is obviously a very naive view of the transfer of data that happens when we’re online. And it comes from a place of not having to worry about the state getting hold of data about me and what they might do with it. It comes from a place of not having to protect special characteristics about myself that might expose me to discrimination. And it comes from a place of being in Europe and having the General Data Protection Regulation (GDPR), which protects that transfer of data. …
Digital advertising is a booming industry: worth over $300 billion in 2019 alone. It’s also the primary business model sustaining the internet, humanity’s most important communications tool. But as AI-powered advertising grows more pervasive and sophisticated, it is doing so without guardrails. There are few rules to ensure it doesn’t surveil, misinform, or exclude consumers. If the industry doesn’t undergo major reform, these problems will only grow more pronounced.
Twenty years ago, digital ads were little more than online billboards — pop-ups that didn’t know who was seeing the ad, or why.
But today’s AI-powered digital advertisements are exponentially more sophisticated. The technology behind these ads can profile consumers and segment them into precise audiences, or make assumptions that cause discrimination. It can even serve ads based on the emotion detected on a consumer’s face, even as they sit in their own homes. …
A bibliography for my recently published report can be found below. All sources were accessed between 1st February and 25th May 2020.
Aapti Institute (2020) The Aapti Podcast: Discussing Surveillance with Divij Joshi, Mozilla Tech Policy Fellow. Aapti Institute. <https://soundcloud.com/user-951817091/the-aapti-podcast-discussing-surveillance-with-divij-joshi-mozilla-policy-fellow>
AB Newswire (2018) Emotion Detection and Recognition Market Insights, Emerging Technologies, Growth Factors | North America to Dominate the Global Industry Over 2023. Posted on May 10, 2018, AB Newswire. <https://www.abnewswire.com/pressreleases/emotion-detection-and-recognition-market-insights-emerging-technologies-growth-factors-north-america-to-dominate-the-global-industry-over-2023_214918.html>
Achara, J.P., Ács, G. & Castelluccia, C. (2015) On the Unicity of Smartphone Applications. WPES ’15.
Originally posted on the blog of The Conscious Advertising Network.
The recent Facebook boycotts pose a headache for charities. On one hand, Facebook is a key fundraising tool; on the other, it stands accused of failing to take action on hate speech and setting back civil rights, and posts on the platform have been implicated in genocide in Myanmar. Not to mention removing fact-checks from climate-denial content.
Over 1,100 companies worldwide have pulled millions of dollars in advertising from the social network, with brands from Coca-Cola and Ford to Unilever and Disney demanding that Facebook monitor hate speech more aggressively. …
“I don’t think that listening is the answer. Brands need to listen, act, and be proactive when it comes to world events.
It shouldn’t take the death of a black man to make brands think about board diversity. It shouldn’t take being called out on social media for appearing next to terrorist content for brands to take control of their ad spend, and yet that is the world we are in.”
Read more from me, Nicola Kemp, Chomoi Picho-Owiny and Lydia Hoye on CreativeBrief
BoraCo is a small consultancy consisting of sustainability, risk and technology specialists working with organisations building tomorrow’s world. We have witnessed, first-hand, the considerable and hard-to-foresee effects that can be brought about by apparently small but interrelated risks associated with digital advertising. Although we welcome the desire to create and develop legislation which ensures that AI is ‘trustworthy’, we have several concerns about the current white paper out for consultation by the EC.
This story originally appeared on Creative Brief Bite in September 2019. It was written by Jake Dubbins and me for the Conscious Advertising Network.
Ask most CMOs if they want their advertising to appear next to hate speech, or to fund damaging fake news, and the answer would be an emphatic no. Ask them whether they’d be happy for half of their programmatic ad spend to be lost to fraud, effectively lining the pockets of shady criminal organisations, and the answer would be more emphatic still.
Yet that’s the state of the advertising industry today. The ad money that we insist should be helping us to reach customers in the moments that matter is not fulfilling its promise. …
Advertisers don’t always know they’re funding climate denial, hate speech and some of the worst content on the web. But it is happening right now. In fact, if you’re not using some form of brand-safety software and a trusted partner list at the very least, I can guarantee that somewhere, in some dark corner of the web, your advertising is appearing where you wouldn’t want it. (Even if you are, it’s still likely, but that’s another podcast.)