Whistleblower Frances Haugen has told MPs Facebook is “unquestionably making hate worse”, as they consider what new rules to impose on big social networks.
Ms Haugen was talking to the Online Safety Bill committee in London.
She said Facebook safety teams were under-resourced, and “Facebook has been unwilling to accept even little slivers of profit being sacrificed for safety”.
And she warned that Instagram was “more dangerous than other forms of social media”.
While other social networks were about performance, play, or an exchange of ideas, “Instagram is about social comparison and about bodies… about people’s lifestyles, and that’s what ends up being worse for kids”, she told a joint committee of MPs and Lords.
She said Facebook’s own research described one problem as “an addict’s narrative” – where children are unhappy and cannot control their use of the app, yet feel unable to stop using it.
“I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt that it is possible to make it safe for a 10-year-old,” she said.
The committee is fine-tuning a proposed law that will place new duties on large social networks and subject them to checks by the media regulator Ofcom.
Asked if the law was “keeping Mark Zuckerberg awake at night”, Ms Haugen said she was “incredibly proud of the UK for taking such a world-leading stance”.
“The UK has a tradition of leading policy in ways that are followed around the world.
“I can’t imagine Mark isn’t paying attention to what you’re doing.”
British English problem
Ms Haugen also warned that Facebook was unable to police content in multiple languages around the world – something which should worry UK officials, she said.
“UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually under-enforcing in the UK,” she said.
And she said dangerous misinformation in other languages was also affecting people in Britain.
“Those people are also living in the UK, and being fed misinformation that is dangerous, that radicalises people,” she warned.
Ms Haugen also urged the committee to include paid-for advertising in its new rules, saying the current system was “literally subsidising hate on these platforms” because of their algorithmic ranking.
“It is substantially cheaper to run an angry hateful divisive ad than it is to run a compassionate, empathetic ad,” she said.
She also urged MPs to require a breakdown of who is harmed by content, rather than an average figure – suggesting Facebook is “very good at dancing with data” but pushes people towards “extreme content”.
“The median experience on Facebook is a pretty good experience,” she said.
“The real danger is that 20% of the population has a horrible experience or an experience that is dangerous.”
“Accept under-resourcing”
She warned that employees were unable to report internal concerns at Facebook – something she called a “huge weak spot”.
“When I worked on counter-espionage, I saw things where I was concerned about national security, and I had no idea how to escalate those because I didn’t have faith in my chain of command at that point,” she told the committee.
And she warned: “We were told to accept under-resourcing.”
Similar problems plague Facebook’s Oversight Board, which can overturn the company’s decisions on content, she said. She repeated her claim that Facebook had lied to its own watchdog on multiple occasions, and said this was a “defining moment” for the Oversight Board to “step up”.
“I don’t know what the purpose of the Oversight Board is,” she said.
It comes as several news outlets published fresh stories based on the thousands of leaked documents Ms Haugen took with her when she left Facebook.
Facebook has characterised previous reporting as misleading, and at one point referred to the leaked documents as “stolen”.
“Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites,” a spokesperson said, after Ms Haugen finished giving evidence.
“People don’t want to see it when they use our apps, and advertisers don’t want their ads next to it. That’s why we’ve invested $13bn (£9.4bn) and hired 40,000 people to do one job: keep people safe on our apps.”
The company said that over the last three quarters it had halved the amount of hate speech seen on Facebook, which it said now accounted for only 0.05% of all content viewed.
“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own,” the spokesperson said.
“The UK is one of the countries leading the way and we’re pleased the Online Safety Bill is moving forward.”
Analysis
An avalanche of information emerged on Monday from the leaked Facebook documents – and it was hard to keep up.
Allegations include that the social media giant was aware of its role in inciting violence around the world, and of the harm caused to its users from the US and UK to India and Ethiopia.
A common theme runs through the stories: they all suggest a tension between employees raising the alarm and a corporate machine that does not appear to let those warnings inform its policies.
Journalists have been highlighting many of these same concerns, particularly over the past 18 months. I have investigated the human cost of online disinformation and abuse again and again, and exposed the damage these sites can do to real people offline.
But until these documents were released by Ms Haugen, it was very difficult to know how aware Facebook was of that damage.
These latest leaks reinforce the idea that Facebook is conscious of that damage – although the company disputes a number of the claims.
And it means pressure is mounting on policymakers around the world to do something about it.