An ethical data framework creates a ‘rock solid foundation’ to build out AI principles

By Jenni Baker, Senior Editor

December 14, 2023 | 9 min read

Sponsored content is created for and in partnership with an advertiser and produced by the Drum Studios team.

Data, media and privacy leaders from ISBA, Deloitte and Jellyfish join Google to share lessons for brands on the value of prioritizing good data ethics and privacy practices.

Google, ISBA, Deloitte and Jellyfish speaking at Google's UK Privacy Forum 2023

As the advertising industry advances towards an AI-driven future - one that is also piquing the interest of the C-suite - establishing an ethical framework for data and getting principles in place creates a “rock solid foundation that AI will sit on top of,” says Clare O’Brien, head of media for the Incorporated Society of British Advertisers (ISBA).

O’Brien was speaking on a panel hosted by Sharon Marshall, director, EMEA partnership solutions at Google, as part of its UK Privacy Forum, alongside Zac Faruque, director, risk advisory, data, privacy & analytics at Deloitte, and Alex Davies, executive vice-president of analytics at Jellyfish, exploring the impact of good privacy practices on business.

The industry has a responsibility and an opportunity to reimagine a digital advertising ecosystem that is more transparent and more respectful of people’s privacy, and in doing so to rebuild consumer trust. With 49% of consumers saying they would switch to their second-choice brand if it provided a better privacy experience, there’s a clear business benefit to strengthening customer connections and earning trust.

More than just a number

“It’s principally the advertiser that is in the front line of consumer discomfort with overt targeting,” says O’Brien. “Anyone grappling with corporate data ethics knows this isn’t a simple tick-box exercise - it’s just not convincing to say ‘your data is safe in our hands.’ It has to be demonstrated that it’s true, and brands need to be able to help people understand why their data is important and what happens to it. That transparency is necessary to begin to build trust and confidence.”

If advertisers are thinking about privacy from the consumer perspective, they will get it right, says Deloitte’s Faruque. “One hour of GDPR learning a year is not enough. Simple questions should act as our moral compass - would you do this with your mum’s data? We must sit back and remember that the consumer on the other end of the mobile/laptop/desktop is a human being and we should respect them as such. They are not just a number.”

There’s a responsibility on brands and the partners they work with across the ecosystem to provide more transparency and education around consent, to back up the promise to the consumer, and to help people gain confidence in the modern marketing and advertising ecosystem in which they take an active part.

Changing a culture

This requires a cultural shift to back up the promises made to consumers and to overcome embedded behaviors in the supply chain, so that those promises are not empty ones. That means working with partners who share the same ethical values and respect a consumer’s data to the same degree.

“This collaboration becomes even more important when considering the complexities of the adtech ecosystem,” says Faruque. “It’s imperative to have collaboration from technology, privacy/legal and marketing, both internally and with your external partners - otherwise there can be a lack of understanding and you risk the 11th hour fire drill.”

The panelists suggest embedding the data protection officer and lawyers into marketing teams. “If the lawyers don’t understand data flow processes, it acts as a massive barrier,” says O’Brien. “If they can’t understand how that data is being used, they’re not going to sign off on the promise that says the brand will look after it.”

Simplifying communications

By working more closely together, brands have an opportunity to explain data gathering in a way people understand, rather than in the words lawyers want to see in the consent box.

“If more brands can start to explain why consent is being gathered and what happens to the data in their care, they help people to understand what a good approach is and therefore it becomes a brand value,” says O’Brien. “People whose skills are being deployed to write copy about your products/services should also be deployed to ask for consent – that’s an investment that brands need to make.”

Faruque agrees that “where we are seeing companies succeed is where privacy principles are brought together with the customer experience conversation. If we think about consent from the CX perspective, it’s also about choice. Instead of just yes/no nuclear buttons, why not offer choices like topics of interest, frequency of communications, channel, for example.”
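
As a rough illustration of the kind of granular consent Faruque describes, a consent record can capture more than a single yes/no flag. The minimal sketch below is a hypothetical data structure; the field names and values are assumptions for illustration, not any specific platform’s consent schema.

```python
from dataclasses import dataclass, field

# Illustrative only: field names and values are assumptions,
# not a real consent-management platform's schema.
@dataclass
class ConsentPreferences:
    analytics: bool = False                              # basic yes/no consent
    topics_of_interest: list[str] = field(default_factory=list)
    contact_frequency: str = "never"                     # e.g. "weekly", "monthly", "never"
    channels: list[str] = field(default_factory=list)    # e.g. ["email", "sms"]

def may_contact(prefs: ConsentPreferences, channel: str) -> bool:
    """Only contact a user on a channel they explicitly opted into."""
    return prefs.contact_frequency != "never" and channel in prefs.channels

# Example: a user who wants monthly emails about running gear only.
prefs = ConsentPreferences(
    analytics=True,
    topics_of_interest=["running"],
    contact_frequency="monthly",
    channels=["email"],
)
print(may_contact(prefs, "email"))  # True
print(may_contact(prefs, "sms"))    # False - SMS was never consented to
```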

Future-proofing measurement

In the early days, clients would ask how to work around users’ consent choices, but now they are realizing that privacy and building trust are at the heart of measurement, says Davies from Jellyfish. “Data and privacy are so intertwined that they’re often considered synonymous. Privacy can be a really good value driver and clients who are taking it seriously do see better, tangible outcomes. The industry is waking up to this.”

He points to the importance that AI plays in marketing now, noting that “we’re seeing the growth of AI from every direction [but] AI is only as good as the data it receives - with data signals becoming weaker due to regulatory and technical restraints, signal quality has to go up. I cannot express how important building and utilizing your consented first-party data is in this era of AI.”

He suggests three steps to onboarding in a privacy-safe way:

  • Building trust through consented first-party data and understanding consumers better with more data coming through various data sets across the business
  • Respecting user consent choices - if they say no, think about leveraging conversion modeling to fill gaps while respecting a user’s consent choice
  • Utilizing tech built with privacy in mind - for example, Enhanced Conversions, which uses irreversible hashing and does lead to better results (see the sketch below).
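
On the hashing point, the general idea is that a first-party identifier such as an email address is normalized and passed through a one-way hash before being shared or matched, so the original value cannot be recovered from the digest. The sketch below illustrates that concept in generic Python using SHA-256; it is an assumed, simplified illustration of irreversible hashing, not Google’s Enhanced Conversions implementation.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """One-way hash of an email address: the raw value cannot be recovered
    from the digest, only matched against another hash of the same
    normalized input."""
    normalized = email.strip().lower()  # trim whitespace, lowercase
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same normalized input always yields the same digest, so two parties
# can match records without ever exchanging the raw email address.
print(normalize_and_hash(" Jane.Doe@example.com "))
print(normalize_and_hash("jane.doe@example.com"))  # identical digest
```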

Foundations of trust

What’s clear is that good privacy practices are the responsibility of the whole organization, not just one team or individual. Brands are recognizing that privacy is a cross-functional responsibility - but today that is more an ambition than a reality.

“Culture is about understanding, it’s also about change management,” says O’Brien. “With something as valuable as brand reputation at stake, we encourage brands to think from the top down. The key to all of this is cross-functional collaboration across the organization. The only way to get that effectively is top down, it doesn’t work from the bottom up. An upward or sideways conversation might make incremental improvements to understanding between colleagues but it’s not going to embed a culture in an organization.”

While data and privacy regulations may change from region to region, the corporate principles behind them shouldn’t. By establishing a set of principles they believe in and act on in everything they do, brands are better placed to embrace data and digital ethics and to improve consumer trust.
