‘Unquestionably it’s making hate worse’: Frances Haugen’s testimony on Facebook
In her testimony to the UK parliament, Frances Haugen, the Facebook data scientist turned whistleblower, reiterated that engagement-based ranking algorithms incentivize the creation and dissemination of extremist content. Speaking to the joint committee scrutinizing the draft Online Safety Bill, she said Facebook was amplifying existing societal problems and turning them into far greater issues for the wider population.
Frances Haugen testifying to MPs / © UK Parliament 2021 / Annabel Moeller
She said: “We need to think about where we add selective friction to the systems so that they are safe in every language,” rather than relying on AI, which has proven to be opaque to outside observers and also fallible.
Haugen also noted that a lack of transparency from Facebook is hindering the ability of potential regulators and analysts to assess whether the company is acting in the public’s best interest. She believes that even a basic requirement to publish a list of the experiments Facebook undertakes would help. “If we had that data we could see patterns of behavior and see whether or not you have effective contingencies.”
She also stated that Facebook’s current lack of transparency around segmentation and analysis leaves people open to harm, citing groups and content centered on self-harm as an example.
“Facebook has been unwilling to accept even little slivers of profit, being sacrificed for safety. And that’s not acceptable” – whistleblower @FrancesHaugen tells Online Safety bill committee at Westminster – Rory Cellan-Jones (@ruskin147), October 25, 2021
Haugen said that the current system within Facebook is not set up to reward consideration of personal safety. “The person who can move the metric by cutting the most corners is good. There’s no incentive internally ... you will not get rallied around for help because everyone is underwater.
“Safety is a cost center, it’s not a growth center...”
She also noted that advertising, as just another form of content on Facebook, is as susceptible to engagement-based optimization as anything else. “I’m extremely concerned about paid-for advertising being excluded because engagement-based ranking impacts ads as much as it impacts organic content.
“We have seen that over and over again in Facebook’s research, it is easier to promote people to anger than to empathy or compassion, and so we are literally subsidizing hate on these platforms. It is cheaper substantially to run an angry hateful divisive ad than it is to run a compassionate, empathetic ad.”
That echoes warnings from groups such as IPG about the potential brand damage that can result from Facebook’s opacity.
Existential crisis: How Facebook's own employees struggled with the very technology that made it so successful – and harmful. The iconic Like button, Groups, Shares, Pages, and Newsfeed. October 25, 2021
Conservative MP Damian Collins, chair of the joint committee on the draft Online Safety Bill, asked specifically about regulation of Facebook groups. Haugen responded that Facebook should both require moderators for each group and create a tangible distinction between benign interest groups and those that are primarily used to amplify and spread disinformation.
The Drum asked for comment from Facebook and was directed to an op-ed by its head of global policy management Monika Bickert that was previously published in The Telegraph.
Muddying the waters is the fact that Haugen’s testimony is somewhat tangential to the subject matter of the hearing itself, which concerns the draft Online Safety Bill. The bill ostensibly aims to cut down on online abuse, but has been widely criticized as too blunt a tool, one that would harm marginalized people online.
Haugen said: “I think it’s important to weigh what is the incremental value of requiring real names ... real names are difficult to implement. Most countries in the world do not have legal services where we could verify [real names]. And in a world where someone can use [a VPN] and claim they’re in one of those countries and register a profile that means they can still do whatever action you’re afraid of.”
She noted that this was a potential issue, particularly for domestic abuse survivors, gay people who have not yet come out but are looking for community, and other groups at risk of reprisal from parts of society. She cited both Google and Twitter as examples of companies that are performing better than Facebook “because they know someone is watching.”
Despite that, Collins has accused Facebook of trying to “stop any insights” into how it manages harmful content from being published.
In a statement, he said: “There needs to be greater transparency on the decisions companies like Facebook take when they trade off user safety for user engagement.”
Leaked documents have also shown that Facebook allocates 83% of its budget for countering misinformation to the US alone, leaving the remaining 17% for the rest of the world. Haugen also said that Facebook is “actively misleading” the speakers of most languages because its AI cannot pick out the signifiers of hateful speech in those languages as it can in English.
Over the weekend a group of newspapers – primarily based in the US – made a concerted effort to dig through the vast reams of internal data and details leaked from within Facebook. The goal of the group was to provide a holistic view of what the social network and advertising giant knew about its own failings – and when.
One of the big revelations of the past few days is that Facebook deliberately chose not to hold right-wing news outlets to the standards it had set for news publishers. Reportedly it had a specific exemption for Breitbart – and has been accused of helping propagate information that led to an attempted coup in the Capitol on January 6.
However, a much more fundamental issue is the revelation that in an August 2019 internal memo researchers said it was Facebook’s “core product mechanics” that were responsible for letting misinformation flourish on the platform. In other words, researchers claimed the harm was indivisible from the core product.
Whatever the case, the damage to Facebook’s reputation from the leaks is already done. It is reportedly mooting a rebranding that could arrive as soon as this week (or timed to coincide with its results tonight). Haugen believes that it is incumbent upon governments and potential regulators to act as soon as possible to prevent further harm: “Right now Facebook is closing the door on us being able to act ... we have to be able to take advantage of this moment.”
Worse still for the social network, Axios has reported that its vice-president of global affairs Nick Clegg has warned of “more bad news” for the company in the very near future.