
Web2’s advertising risks will be amplified in the metaverse: here’s how to prepare

By Alex Thomas, Senior Brand Safety & Quality Manager

October 18, 2022 | 10 min read

GroupM’s Alex Thomas offers up advice for brands venturing into the new frontier of web3.


With new experiences come new risks

Ever since Mark Zuckerberg rebranded Facebook to Meta in 2021, there has been a surge of interest in the metaverse. Thought leaders and media pundits have clamored to voice their perspectives on the virtual world designed to drive social connection through interactive experiences. Much of the coverage has been positive, building the hype around the metaverse and all the possibilities that come with it. However, as with every emerging technology, it brings with it new challenges. Nick Clegg, president of global affairs at Meta, told The Guardian in 2021 that the metaverse would take 10 years to build and become mainstream.

During this time, there is an opportunity for brands to connect with users and build their reputations. It also means that brands can use this time to get ahead, develop a strategy, build up their value proposition and set up safeguarding mechanisms and community standards before it takes off.

Roles and responsibilities

If the advertising and media industry wants to embrace the metaverse, it needs to be cognizant of the potential difficulties it may hold. For marketers, risks similar to those experienced with web2 will emerge and amplify within the metaverse.

For example, the metaverse is set to be powered by various technologies such as augmented reality (AR) and virtual reality (VR) headsets, haptic vests and artificial intelligence (AI). These can intensify both helpful and harmful behaviors in immersive, sensory experiences.

As a result, the advertising and media industry must understand the ethical challenges involved and address them from the outset. It has an opportunity to co-create and shape a healthier media-driven future for brands and consumers alike. While the industry cannot address every aspect of ethics, it can bring more responsibility into its daily actions.

The real question is, what do brands need to look out for and how do they work together to ensure user and brand safety in decentralized worlds?

Risks and safeguards

The risks the metaverse poses for people and brands are varied, ranging from the emotional to the financial. All the risks brands must deal with in the real world will need to be accounted for in a metaverse environment too, particularly those that cover commerce, content and human behavior at scale.

The advertising and media industry has a collective legal and moral responsibility to protect people, communities and the planet. Because of this, brands need to consider the following non-exhaustive risks and safeguarding measures:

Data ethics

In terms of data collection, the metaverse will take user data beyond the contact data associated with web2 marketing, opening the door to the collection of sensitive biometric data. In the web2 environment, marketers use consumer data to create personalized experiences – a practice that is only likely to intensify in the metaverse.

As Mastercard’s chief privacy officer Catherine Louveaux put it in a blog post: “The metaverse will be data collection on steroids.”

The metaverse enables the collection of sensitive personal biometric data, such as facial recognition and fingerprint data. The decentralized structure of a metaverse environment will allow users to secure much of their personal data, but users – particularly the young – need to be aware of the risks.

When it comes to data collection and the metaverse, it is critical that brands create safeguards to avoid outcomes that would undermine human dignity. For example, if stolen, biometric data could be used to create deepfakes or for hacking into private data. To safeguard this, data processing should be rooted in the principles of accountability, respect for private life and full transparency.

Algorithmic bias and diversity & inclusion

The algorithms that underpin metaverse technology inherit the biases of the programmers building them. Because of this, there is a danger they could shape how people experience their metaverse realities by favoring some users and content and disfavoring others. To mitigate this, the teams building algorithms need to define clear criteria and policies for identifying bias, establish ethical governance frameworks, train teams and hire people from diverse backgrounds.

Sustainability

The technology that underpins the metaverse environment, such as cryptocurrency and non-fungible tokens (NFTs), consumes vast amounts of energy. Plus, the new equipment consumers need to visit virtual worlds – such as VR or AR headsets – means more waste. The industry needs to investigate newer, more eco-friendly processes for validating blockchain transactions (such as proof of stake), as well as create hardware recycling programs to support sustainable consumption.

Financial regulation concerns

Cryptocurrencies can be highly volatile and crypto wallets can be hacked. For example, two in-app ads for Crypto.com were banned for failing to make sufficiently clear that the value of cryptocurrency investments could fall as well as rise, and for irresponsibly encouraging consumers to invest in cryptocurrency using a credit card. When it comes to unregulated cryptocurrencies and NFTs, to deter fraud brands should apply the basic principles of advertising self-regulation: do not mislead or overpromise, substantiate all claims, disclose risks and be upfront about intellectual property rights obligations.

Human safety

In the metaverse, individuals who are bullied or harassed could experience it with new intensity because it is a multisensory world. The gamification of reality is attractive to children, but it heightens the risk of exposing them to explicit sexual content, bullying, harassment and predation. The Center for Countering Digital Hate found that in the app VRChat, an incident involving minors being exposed to explicit sexual content, bullying or harassment occurs roughly every seven minutes. Grooming is a risk here too. Furthermore, a BBC investigation revealed that children can visit virtual strip clubs, where they can witness all manner of behavior inappropriate for their age. Disturbing as these findings are, they mirror the issues children already face on today's internet. Disinformation, already a problem on social media, will remain open to abuse in metaverse environments too.

Combined with deepfakes (which can be created using biometric data), disinformation poses a real risk to the reputations of respected media and government organizations. It could even be used to enable cyber-attacks.

The safety of users needs to be the main priority. Platforms need to make teams accountable, maintain specific and clearly defined community guidelines, empower users to report illegal and inappropriate behavior, and regularly publish transparency reports on the enforcement of those guidelines. Special attention should also be dedicated to children's media literacy and to parents' role in keeping their children safe.


Brand safety

The metaverse is introducing new platforms that have yet to address brand safety, ad fraud or ad measurability. First- and third-party measurement in metaverse environments is still new (with a few in-game provider exceptions), and 3D environments make viewability challenging to measure. Brands will need to work with the Global Alliance for Responsible Media and IAB Tech Lab to categorize harmful content, establish clear measurement frameworks for fraud and viewability, and treat human safety as a precondition of brand safety.

What role can the media and advertising industry play to ensure care and responsibility in the metaverse?

The metaverse offers a new creative space to express, interact and co-create. It’s an opportunity to create inclusive experiences and a chance to hire, empower, listen and learn from a diverse body of creators, talent and developers.

The advertising and media industry has a responsibility to fill the governance vacuum, work together and build frameworks and standards for the metaverse – just as it has in the real world.

Marketers can begin to lay the groundwork to support the development of an ethical and safe metaverse by getting started on the following:

  1. Educate and upskill so that they understand the complex technology that underpins emerging virtual worlds.

  2. Collaborate with the intention to build community guidelines and standards across virtual worlds and apply the principles of ethics and safety to product and partnership building to co-create a safer space. The work of the Global Alliance for Responsible Media will be even more important as we aim to replicate and elevate it to accommodate the metaverse’s challenges.

  3. Assess the ecosystem as it evolves. Virtual worlds are changing rapidly. Brands need to monitor the situation and apply the basic risk mitigation tactics and responsibility principles to the opportunities that are already available.

  4. Get ahead. Embedding responsibility and community standards on a platform or virtual world before it takes off is much easier to maintain than trying to catch up.

At GroupM, we are working to embed the principles of our Responsible Investment Framework in our thinking and action in the metaverse by building alliances, products and services with responsibility – not as an afterthought, but as a guiding principle.

Alex Thomas is senior brand safety and quality manager at GroupM.
