
Existential risks aside, AI machines are just as sexist & racist as humans

By Kat Thomas, Founder & Global Executive Director

March 31, 2023 | 5 min read

As experts raise concerns about the safety and societal implications of artificial intelligence, an experiment by earned media agency One Green Bean has highlighted yet another issue with AI: the machines are just like us.

One Green Bean staff with their AI-generated images, based on job title

Recently we did a couple of little experiments that got a lot of people talking. I won’t lie; that was the intention, because right now our industry is losing its shit over artificial intelligence. It’s the shiniest new toy, dominating endless conversations in agencies around the world – we’re all awe-struck by its potential to elevate and enhance the creative process whilst obsessing over the darker side… the risk that its smarts will eventually put talented and highly trained specialists prematurely out to pasture. Reactions, both internally and from our clients, have ranged from astonishment to panic… to resignation that the robots have finally arrived.

Take ChatGPT – a platform that can knock out highly crafted, considered copy in seconds, often indistinguishable from something a human wrote. Fifteen-year-old me, churning out 100+ cover letters in the quest for a foot in the door, would have cried tears of joy at that kind of shortcut. As would eighteen-year-old undergraduate me, who didn’t really grasp the work hard, play hard Ts&Cs associated with studying.

Take Midjourney – an AI image-generation tool that takes text prompts and uses a machine-learning model to produce completely original imagery. The better the description of what you need, the better the image. The gasps from our creative studio – who are often tasked with bringing to life ideas that don’t exist yet – weren’t quiet. Typically, they’re trying to visualise innovations, partnerships, collabs, activations, stunts, installations and more… starting completely from scratch. The time-saving impact is massive, and the shift in craft from ‘create’ to ‘post-create’ is already evident. AI can reduce the ‘cognitive load’ of creating, essentially presenting you with a rough first draft to finesse.

All a bit of a game changer, right? Well, the more we used it, the more we started to notice the biases in the imagery it was generating – to the point where it was both laughable and a bit uncomfortable. I asked Midjourney to generate an image of what I should look like as an ECD (executive creative director), along with images of a managing director and a regional PR lead. The platform defaulted to images of men every single time.

Then we asked Midjourney to visualise the top 20 highest-earning professions in the UK. You can probably guess how that went… 88% of the images were of men. Most of the time, they were presented with exaggerated masculine features: chiselled jaws, overtly muscular, gym-bunny-esque. Meanwhile, women only ever appeared in stereotypical job roles. And even in professional situations, they were often sexualised, with larger-than-anatomically-feasible boobs, unbuttoned tops and glossy pouting lips.
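For anyone who wants to poke at this themselves, here’s a rough sketch of how an audit like ours could be scripted. It’s illustrative only, not the setup we used: the `generate_image` function is a hypothetical stand-in (Midjourney is driven through Discord rather than a public API), the profession list is a placeholder, and the gender labels still have to come from a human reviewer.

```python
from collections import Counter
from pathlib import Path

# Placeholder list - swap in the actual top-20 highest-earning UK professions.
PROFESSIONS = [
    "chief executive",
    "marketing director",
    "aircraft pilot",
    # ...
]


def generate_image(prompt: str) -> bytes:
    """Hypothetical stand-in for whichever image-generation service you use."""
    raise NotImplementedError("plug in your image-generation client here")


def run_audit(out_dir: str = "audit_images") -> None:
    """Generate one image per profession and save it for human review."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for job in PROFESSIONS:
        image = generate_image(f"a portrait of a {job}")
        (out / f"{job.replace(' ', '_')}.png").write_bytes(image)


def tally(labels: dict[str, str]) -> None:
    """labels maps each profession to the gender a human reviewer saw presented."""
    counts = Counter(labels.values())
    total = sum(counts.values())
    for gender, n in counts.most_common():
        print(f"{gender}: {n}/{total} ({n / total:.0%})")
```

Run across 20 professions, a tally like this is all it takes to turn a gut feeling into a number – in our case, 88% male.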

Why? It’s simple really – these tools are trained on imagery and data points scraped from up to five billion sources across the web. They work with what’s already out there in the ether – they hold a metaphorical mirror up to the inequality of today’s world. Worth saying too: they don’t just play back gender bias; the racial bias is just as stark. It’s real… machines can and do discriminate.

It’s not an easy fix. Some people think the programmers behind these platforms need to ‘programme differently’, but that’s not really the solution. Society ultimately needs to push harder for equality so these tools have something better to reflect. Employers have a huge role to play, and there’s, of course, a responsibility on big tech to consider its role in creating ethical and inclusive tools that respect human rights. Because right now, they recommend content that reflects not only the world’s biases but the individual user’s too – which doesn’t feel like the societal advancement I expect from the droids that will one day control us.

Kat Thomas is the founder & global executive director of earned media agency One Green Bean.
