The issue at hand is much bigger than privacy – it’s about all data-enabled harms

By Arielle Garcia, chief privacy officer

May 10, 2023 | 9 min read

As part of The Drum’s latest Deep Dive, The New Data & Privacy Playbook, Arielle Garcia, chief privacy officer at IPG-owned media agency UM Worldwide, argues that it’s high time for the ad industry to widen its lens on what ‘privacy’ means.

As the fifth anniversary of the EU’s General Data Protection Regulation (GDPR) approaches, I’ll be celebrating a milestone of my own: my 10th year as part of UM. It’s also my third year of having the privilege to contribute to The Drum’s Data & Privacy Deep Dive, sharing my perspective on the opportunity and imperative of a people-first paradigm for advertising data collection and use – one that delivers value to consumers and sustainable growth to brands. While today I hope to do the same, a special milestone calls for a more personal approach.

Ten years ago, I planned to get a steady job, go to law school at night and move on to a career where I could contribute to a better world – likely in the non-profit sector, perhaps at an NGO supporting human rights.

As graduation neared in 2018 – along with GDPR’s effective date – I realized that I was exactly where I needed to be. I had become enamored with UM, having had the opportunity to witness how great work can shape culture and add value to people’s lives, and to experience the privilege of being a trusted steward of our clients’ brands.

I also realized, with Cambridge Analytica making headlines daily, that our industry lives squarely at the intersection of technology and society, setting the norms, sustaining the market dynamics and supporting the incentive structures that underpin the information economy. I recognized that what we do and don’t do, what we accept and expect from our partners, can and does have a profound impact on society, democracy and individual lives and liberties.

I was struck by the reality that while all aspiring lawyers are required to learn about ‘professional responsibility,’ no such universal understanding of fundamental duties or ethical practices existed to guide our industry’s collective behavior.

With the industry at a critical inflection point, what more important opportunity could there be than to contribute to a more responsible future – to drive towards new standards that contemplate the interests of all stakeholders and the impacts on people and society?

Now, what does this all have to do with privacy? Surely, the key to societal well-being and institutional trust cannot be found in a cookie banner. To this point, Federal Trade Commission (FTC) commissioner Rebecca Slaughter, in her 2021 PrivacyCon opening remarks, challenged attendees to “reject ‘privacy’ as the animating framework.” She urged them to consider that risks – including algorithmic bias, misinformation, youth safety, personal autonomy and civil rights – are far broader than traditional notions of ‘privacy’ would suggest. Slaughter suggested instead that we should think of these challenges as “data abuses.”

Indeed, in an increasingly data-driven world, any harm can be data-enabled. This construct better captures the scope of the challenge at hand – one that requires the right balance between privacy, competition and consumer protection to create a fair and healthy ecosystem for all – and one that ‘notice and choice’ alone is simply insufficient to address.

In a connected world, with so much power concentrated in the hands of big tech, is choice for consumers or marketers much more than an illusion? Are people being asked a fair question?

When studies highlight that consumers understand and appreciate the ad-funded web, touting that people are happy to share data for personalized experiences, these surveys are not asking people whether they’re okay with the possibility that their browsing or mobile location data might be used to track their participation in a protest, inadvertently reveal their sexual orientation or prosecute them for seeking reproductive healthcare services. Nor are people asked whether they are comfortable with their engagement data being used to construct a profile that can be exploited to promote messaging anticipated to incite anger and drive political polarization, or to deliver content to their pre-teen child that exacerbates her body dysmorphia.

Against this backdrop, it becomes clear that the FTC’s focus on tracking pixels and the recent surge in privacy litigation under the Video Privacy Protection Act are not about a specific technology, but about fairness and preventing inadvertent or unwanted disclosure of personal information.

There are irrefutable and mutual benefits to businesses and to people alike where data is used to deliver value, enhance experiences, educate, empower or enable efficiency. But there are also important responsibilities that must be embraced, and guardrails that must be established for harm to be mitigated while benefits are preserved.

For this reason, I’d like to make the same request that Commissioner Slaughter made of her audience – particularly as the acceleration of AI development and adoption in the absence of governing ethical standards runs the risk of driving existing harms at scale, even under the guise of enhancing privacy.

We are once more at a future-defining inflection point. Let us look beyond the narrow notion of ‘privacy’ and commit to making meaningful progress towards mitigating all data-enabled harms.

Let us align on a common set of principles to guide our thinking, our conduct and our products as an industry – one that contemplates the impact on people’s lives, on cultural norms, on society and the sustainable growth of our businesses.

Let us finally stop skirting the edges of what counts as personal versus de-identified information and arguing about the need for sensible regulation, and instead reinvest that energy into building a better future.

Let us paint the lanes that separate relevance from manipulation and let us teach practitioners on all sides of the ecosystem how to drive on the right side of the road.

Let us work together to drive practical alignment on what ‘sensitive data’ is – and stop collecting or using it in ways people didn’t ask for – instead of scoffing at unworkably broad language.

Let us agree on, demand and adapt to a reality in which youth data is not monetized and young people’s safety and well-being are never compromised for commercial gain – and let’s focus instead on age-appropriate strategies to earn our place in the lives of younger generations.

And then, let us use these learnings to reinvent the way we connect with audiences of all ages, with a commitment to respect for individual autonomy, preference and expectation, underpinned by authenticity and shared value.

After all, isn’t that what we all came here to do in the first place?

Arielle Garcia is chief privacy officer at UM Worldwide. To read more from The Drum’s latest Deep Dive, where we’ll be demystifying data & privacy for marketers in 2023, head over to our special hub.
