
Did publishers and social media giants have a part to play in the Christchurch terror attacks?


By Shawn Lim, Reporter, Asia Pacific

March 22, 2019 | 11 min read

In the aftermath of the Christchurch mosque terror attack, brands including ASB, Burger King and Spark have pulled, or are planning to pull, their ad spend from tech giants such as Facebook and Google after the terrorist live-streamed his massacre of 50 people on their platforms.

While Facebook and Google have borne the brunt of criticism for carrying the live-streamed attack, the mainstream press in Australia and the United Kingdom has also been slammed for ignoring brand safety by publishing the footage. Experts believe this criticism is fair, as publishers have been guilty of using the footage to drive traffic.

“Traditional media companies should be more rigorously policed than Facebook and Google. Social media, by definition, is unfiltered, which means anyone can upload anything and are not subject to editorial oversight,” Alex McKinnon, the morning editor at Schwartz Media tells The Drum.

He also argues traditional news outlets are supposed to balance the public interest in newsworthy stories with their obligation to report responsibly, in ways that do not actively cause harm.

“Christchurch exposed how many media companies are unwilling or unable to enact basic editorial standards. The Australian version of the Daily Mail didn't just quote extensively from the Christchurch terrorist's manifesto; they made it publicly available for download on their website,” McKinnon explains.

Gavin Ellis, the former editor-in-chief at the New Zealand Herald and now an independent media consultant, points out no New Zealand publishers have carried the video apart from a brief clip showing the gunman’s face before the shooting, which appeared very early in some coverage and was quickly taken down.

This is because the New Zealand film censor has declared the entire video to be ‘offensive material’ and it is an offence to publish it there.

He notes a 22-year-old man has already been charged with that offence, and says the use of this material by any media to drive traffic to their sites is reprehensible and deeply offensive to all New Zealanders, not least the country's Muslim community. He also says Google and Facebook make a pretence of being no more than common carriers of material posted by others.

“The New Zealand Prime Minister called them ‘publishers’ in Parliament yesterday and she is right — major social media platforms must carry the same obligations that news media must shoulder. It is not a matter of publishers being like Facebook and Google. It is a matter of Facebook and Google being like publishers,” Ellis tells The Drum.

Agreeing with Ellis, Simon Birkenhead, the chief executive of KPEX, a programmatic online advertising consortium owned by NZME, Stuff, TVNZ and Mediaworks, adds that, unlike publications in the US and Europe, most New Zealand news publications did not even name the shooter until after he had appeared in court.

In addition, he points out all publishers within the KPEX alliance immediately paused all advertising on news articles relating to the Christchurch shooting as soon as the news broke.

“Therefore, while they were covering the story, there was no advertising appearing alongside this content,” he says to The Drum. “This is a very different approach to that of Facebook and YouTube, who continued to display ads alongside the shooting content, and ads continued to run on Google Display Network publishers even though many of these may have been showcasing the shooting video to drive traffic.”

Is it fair to just blame Google and Facebook?

McKinnon admits that regulating social media effectively while preserving people's privacy and freedom to disseminate information is a tricky needle to thread as over-regulation creates its own problems, and no one wants a worldwide equivalent of the Chinese firewall.

However, he argues Facebook, Google and others have repeatedly shown they are incapable of regulating themselves effectively, pointing to the spread of hate speech, the endless breaches of privacy and the sharing of personal information with corporate and government third parties.

Ellis points out that what played out after the events of last Friday shows that both Facebook and Google contributed materially to the spreading of hate. “They must put their houses in order or they must be compelled to do so. They are not the victims here. The victims are 50 good people, their families and the nation that grieves for them,” he says.

Birkenhead, meanwhile, believes it is fair that all media publishers are held to account for the promises they have made to their clients. He says other publishers adhere to those promises and sacrifice ad revenue when it is the right thing to do.

“Facebook and Google have shown they are unwilling, or unable, to keep the promises they have made around unsafe advertising content and the industry needs to raise the pressure on these companies to comply with both their own promises and the requirements of their clients,” he explains.

When asked to comment, a Google spokesperson tells The Drum that since Friday’s horrific tragedy, it has removed tens of thousands of videos and terminated hundreds of accounts created to promote or glorify the shooter.

The spokesperson points out the volume of related videos uploaded to YouTube in the 24 hours after the attack was unprecedented both in scale and speed, at times as fast as a new upload every second.

“In response, we took a number of steps, including automatically rejecting any footage of the violence, temporarily suspending the ability to sort or filter searches by upload date, and making sure searches on this event pulled up results from authoritative news sources like The New Zealand Herald or USA Today. Our teams are continuing to work around the clock to prevent violent and graphic content from spreading, we know there is much more work to do,” says the spokesperson.

Facebook, meanwhile, declined to comment and instead pointed The Drum to a post that it had published on its blog.

What are the roles of marketers and the media in this debate?

ASB, Burger King and Spark reacted after the Association of New Zealand Advertisers (ANZA) and the Commercial Communications Council (Comms Council) called on social media platforms to prevent such live-streaming.

In a joint letter, the two bodies' chief executives, Lindsay Mouat and Paul Head, called on advertisers and agencies to consider suspending advertising on Facebook until its live-streaming functionality is either taken down or sufficient controls are put in place. They also urged advertisers and agencies to put the topic on the agenda at executive level within their organizations, and to petition Facebook for change.

Finally, they asked agency and client communities in their own countries to work together, and with their industry associations and government regulators, to apply pressure to bring about change.

This is not the first time brands have pulled advertising from social media, Ellis notes: earlier this year, Spark pulled its advertising from YouTube over concern about paedophile content targeting children.

Explaining why brands are pulling spend from Facebook and Google rather than from publishers, McKinnon says that, apart from the obvious and horrifying prospect of an ad for their products running as pre-roll to a live-streamed mass murder, brands recognize the widespread public disgust at how the tech companies have allowed hatred and violence to spread.

He also calls on ad agencies, which have been reluctant to come forward and talk about brand safety and hate speech, to put pressure on Facebook and Google.

WPP ANZ declined to comment when asked by The Drum, while Omnicom Media Group and Dentsu Aegis are yet to respond.

“Ad agencies have been at the mercy of the big social media companies for a while, as they need to reach eyeballs and there's increasingly nowhere else to go,” he says. “Agencies need to remember they have the power to demand greater ethical standards and regulation of social media platforms would be a good way to exercise it.”

Birkenhead points to other high-profile examples from a long list of unsavoury content published and shared on Facebook, YouTube and Google Display Network publishers, such as terrorist videos from Islamic State and racist commentary from extremist groups.

He says this has further reduced marketers' confidence in Facebook and YouTube as safe platforms for their ads. “The Christchurch shooting was likely the last straw for brands whose trust in these platforms has been repeatedly challenged over the last few years,” he adds.

Birkenhead also believes journalists have a role to play in highlighting to marketers that the brand safety promises made by certain publishers are being broken, and in raising questions across the industry about whether this is acceptable. Given the scale of advertising spend going to the global tech platforms, he says, this is a very valid question to raise.

“Other news publications (in New Zealand at least) sacrificed revenues to prevent ads showing against any content related to the Christchurch story,” he reiterates. “But the tech giants have repeatedly shown that they have not sufficiently changed their practices to prevent the distribution of objectionable, offensive or illegal content, or the display of ads alongside this content.”

He continues: “Additional pressure from the media and chief marketing officers may be required to keep this issue top of mind for these companies' senior executives and force them to finally change their business models.”

However, while publishers like News UK-owned The Times have in recent years influenced advertisers to pull spend because of stories they have written, McKinnon says it is a strange situation in which the publishers are right, but for the wrong reasons. He agrees Google is a monopoly that should be broken up, but notes that News Corp wants that outcome so it can remove a potential rival and swoop in on its territory, particularly in Australia.

“Domestically, News Corp has recommended Google be broken up and its advertising arm separated, as it has been acting as ‘a monopoly’,” he says. “Murdoch has a clear financial interest that he's using his reporters to further. If it's not unethical, I don't know what is.”

Are all media treated equally?

McKinnon says publishers can and should be partnered with to improve their reportage if they express a willingness to do better. That said, he notes media organizations like News Corp have built their business models on actively spreading and promoting inflammatory, hateful and bigoted ideology.

In addition, he points out there are many media companies with deeply flawed practices regarding the reporting of terror events, Islamophobia and right-wing extremism. Some examples include articles and coverage often riddled with basic errors, failure to check or flag the backgrounds of the right-wing figures given exposure, or actively perpetuating inaccurate and harmful misconceptions about Muslim people and non-white people more broadly.

“They have no interest in doing anything differently because their profits and profile depend on the hatred they stoke. There's a growing awareness that there's no point in trying to engage constructively with media outlets that approach these issues in bad faith,” he explains.

Ellis accuses the social media giants of behaving as if they are a law unto themselves. He says they are adept at utilizing the inadequacies of law – jurisdictional limits, for example – to avoid responsibilities that news media organizations have long accepted. He feels it is time for the international community to collectively address the significant issues that are constantly being thrown up, not least in relation to hate speech and extremism.

“The events of last Friday and the live-streaming of this awful crime may not be the first time that extremists have utilized social media to spread their poison, but it may be the tipping point that leads to something being done about it,” he says.

Brands should insist on being provided with brand-safe advertising environments by every publisher they advertise with, says Birkenhead.

The Christchurch atrocity could finally force brands, agencies, media companies and technology businesses to start a more nuanced and sustainable conversation about the future of brand safety, media funding and the media's responsibility not to spread hate.
