By Audrey Kemp, LA Reporter

January 26, 2024

Child safety advocacy organization Heat Initiative accuses Apple of negligence in preventing the spread of child sexual abuse material (CSAM) in a new short film set to run during the NFC Championship this weekend.

Heat Initiative, a nonprofit organization dedicated to encouraging technology companies to detect and eradicate CSAM from their platforms, is launching another critical campaign aimed squarely at Apple.

Slated to run during the NFC Championship on Sunday, the new ad, titled ‘Rotten,’ accuses the tech giant of inaction in removing harmful imagery and materials on iCloud.

Apple did not immediately respond to The Drum’s request for comment.

In the ad, viewers hear the accounts of real child sexual abuse survivors whose abuse imagery is currently being shared on iCloud. “I was three months old when it started,” one survivor says. “The images of me have been found over 36,000 times. They’re on iCloud now.”

As survivors’ voices are heard, an apple rots in a child’s hands. The ad closes as they proclaim, “Apple could help us. But they don’t.”

“Apple could easily prevent known child sexual abuse images and videos from being stored and shared on iCloud. Yet the company has chosen to allow it,” Sarah Gardner, founder and chief executive of Heat Initiative, tells The Drum. “This has devastating real-life consequences for survivors of childhood sexual abuse.”

Gardner explains that the organization chose to use the image of a rotting apple to “symbolize the destructive and sickening consequences of Apple’s inaction.”

The new ad, created with The Soze Agency, is part of the Heat Initiative’s $2m Apple call-out campaign. In September 2023, the group released digital ads on Politico and other websites frequented by policymakers in Washington, and placed outdoor ads throughout San Francisco and New York that said, “Child sexual abuse material is stored on iCloud. Apple allows it.”

The campaign underscores a yearslong debate over digital safety vs. digital privacy, colloquially known as the “Crypto Wars.” In recent years, arguments for thwarting encryption have shifted from fears of terrorists chatting in secret to child predators evading police scrutiny, according to The Intercept.

Apple’s privacy protections have long been a critical part of its sales pitch to consumers. In 2021, Apple killed its plans to develop an iCloud CSAM scanning feature after concerns were raised about how such a tool could compromise the privacy and safety of all iCloud users.

That same year, Google made more than 875,000 reports of potential child sexual abuse imagery on its platforms and servers, and Facebook made 22m. Apple made just 160 reports, according to data from the National Center for Missing & Exploited Children (NCMEC).

In August 2023, Erik Neuenschwander, Apple’s director of user privacy and child safety, shared the company’s response to Heat Initiative’s previous accusations with Wired: “Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” he wrote.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

The ad is slated to premiere in the Bay Area – Apple’s home turf – during the NFC Championship, coinciding with the 40th anniversary of Apple’s iconic Macintosh Super Bowl ad, in an attempt to get the firsthand stories of survivors heard by Apple’s decision-makers. It will then be amplified in the coming weeks by a robust digital and social media plan.
