This week, Facebook has faced a series of accusations about its internal workings, based on revelations in the Wall Street Journal and elsewhere.
Much of the information comes from Facebook’s own internal documents, suggesting the company now has some whistle-blowers in its ranks.
The documents will provide governments and regulators with plenty to pore over as they consider their next moves.
However, Facebook has defended itself against all the accusations.
Here are five things that were revealed this week:
Celebrities were treated differently by Facebook
According to documents reported by the Wall Street Journal, many celebrities, politicians and high-profile Facebook users had different rules governing what content they could post, under a system known as XCheck (cross-check).
Facebook has admitted that criticism of the way it implemented its cross-check system was “fair”, but said the system was designed to create “an additional step” when posted content required more understanding.
“This could include activists raising awareness of instances of violence or journalists reporting from conflict zones,” it said.
It said a lot of documents referred to by the Wall Street Journal contained “outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross-check and has been working to address them”.
Despite its pushback, Facebook’s own Oversight Board, which it appointed to make decisions on tricky content moderation, has demanded more transparency.
In a blog this week, it said the disclosures had “drawn renewed attention to the seemingly inconsistent way that the company makes decisions”.
It has asked for a detailed explanation of how the cross-check system works.
It warned a lack of clarity on cross-check could contribute to perceptions that Facebook was “unduly influenced by political and commercial considerations”.
Since it began examining how Facebook moderates content, the Facebook-funded Oversight Board has made 70 recommendations about how the company should improve its policies. It has now set up a team to assess how the social network implements those recommendations.
Its response to employee concerns about human trafficking was often ‘weak’
The documents reported by the WSJ also suggested Facebook employees regularly flagged information about drug cartels and human traffickers on the platform but the company’s response was “weak”.
In November 2019, BBC News Arabic broadcast a report highlighting the issue of domestic workers for sale on Instagram.
According to internal documents, Facebook was already well aware of the issue. The WSJ reported that Facebook took only limited action until Apple threatened to remove its products from its App Store.
In its defence, Facebook said it had a “comprehensive strategy” to keep people safe including “global teams with native speakers covering over 50 languages, educational resources and partnerships with local experts and third-party fact-checkers”.
Critics warn that Facebook does not have the means to moderate all the content on its platform and protect its 2.8 billion users.
David Kirkpatrick, author of The Facebook Effect, told the BBC’s Tech Tent podcast that he felt Facebook had no motivation “to do anything to mediate the harms” outside the US.
“They have plenty of things they have done, including hiring tens of thousands of content reviewers,” he said.
“But one statistic that jumped out for me from the Wall Street Journal was that for all their disinformation and misinformation work in 2020, only 13% of that work was outside of the United States.
“For a service that is 90% outside of the United States – and one that has had enormous impact, in a very negative way, on the politics of countries like the Philippines, Poland, Brazil, Hungary, Turkey – they are not doing anything to remediate all that.”
Mr Kirkpatrick suggested Facebook was only “responsive to PR pressures” in the US because those could affect its share price.
Facebook faces a huge lawsuit from shareholders
Facebook is also facing a complex lawsuit from a group of its own shareholders.
The group alleges, among other things, that Facebook’s $5bn (£3.65bn) payment to the US Federal Trade Commission to resolve the Cambridge Analytica data scandal was so high because it was designed to protect Mark Zuckerberg from personal liability.
Facebook said it did not have anything to say about the continuing legal matter.
Has Facebook been promoting positive stories about itself?
This week, the New York Times suggested that Facebook had started an initiative to pump pro-Facebook content into people’s news feeds in order to boost its image.
The newspaper said Project Amplify was designed to “show people positive stories about the social network”.
Facebook said there had been no changes to its newsfeed ranking systems.
In a series of tweets, spokesman Joe Osborne said the test of what he called “an informational unit on Facebook” was small and only happened in “three cities”, with posts clearly labelled as coming from the firm.
He said it was “similar to corporate responsibility initiatives people see in other technology and consumer products”.