Facebook Oversight Board criticizes company for not being ‘fully forthcoming’

Over the last several weeks, Facebook has faced renewed scrutiny over its transparency, prompting the company’s oversight board to publish its first quarterly report detailing its recommendations for Facebook.

The transparency report, published Thursday, details cases the board has received from users, the decisions it has taken and its recommendations to the social media giant.

"Today’s reports conclude that Facebook has not been fully forthcoming with the Board on its ‘cross-check’ system, which the company uses to review content decisions relating to high-profile users," the board wrote in a press release Thursday. 

"On some occasions, Facebook failed to provide relevant information to the Board, while in other instances, the information it did provide was incomplete," the board continued in its report.

Since publishing its first decisions in January, the Facebook Oversight Board said it has pushed Facebook to reveal more information about how it works and to treat its users fairly, noting that it has taken on 20 important cases and issued 17 decisions on topics ranging from hate speech to COVID-19 misinformation. 

An estimated 46% of cases came from the U.S. and Canada alone. More than one-third of the submitted cases involved hate speech, and roughly another third involved bullying and harassment.

In the 77-page document, the board said it made 52 recommendations to Facebook and received nearly 10,000 public comments. 

"The vast majority of these comments (9,666) related to the case on former US President Donald Trump," the board wrote in its report.

In June, Facebook suspended Trump from its site for two years in response to a ruling by its independent oversight board. 

Sign with logo at the headquarters of social network company Facebook in Silicon Valley, Menlo Park, California, November 10, 2017. (Photo by Smith Collection/Gado/Getty Images)

In May, the oversight board upheld Facebook’s suspension of the former president's Facebook and Instagram accounts, imposed after he praised rioters who committed violence at the U.S. Capitol on Jan. 6. 

"We are today announcing new enforcement protocols to be applied in exceptional cases such as this, and we are confirming the time-bound penalty consistent with those protocols which we are applying to Mr. Trump’s accounts," Facebook wrote.

"Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols. We are suspending his accounts for two years, effective from the date of the initial suspension on January 7 this year," the company wrote.

At the end of the suspension period, Facebook said it will look to experts to assess whether the risk to public safety has receded. 

Trump has been suspended from the platform since Jan. 7, the day after a mob of his supporters engaged in the deadly riot at the U.S. Capitol.

The board said it was committed to examining whether Facebook had been fully forthcoming in its responses regarding its cross-check system. 

"When Facebook referred the case related to former US President Trump to the Board, it did not mention the cross-check system. Given that the referral included a specific policy question about account-level enforcement for political leaders, many of whom the Board believes were covered by crosscheck, this omission is not acceptable," the report read.

According to the oversight board, Facebook noted that for teams operating at the scale of millions of content decisions a day, the numbers involved with cross-check seem relatively small, but recognized that its phrasing could come across as misleading.

"We also noted that Facebook’s response to our recommendation to ‘clearly explain the rationale, standards and processes of [cross-check] review, including the criteria to determine which pages and accounts are selected for inclusion’ provided no meaningful transparency on the criteria for accounts or pages being selected for inclusion in cross-check," the board continued. "The credibility of the Oversight Board, our working relationship with Facebook, and our ability to render sound judgments on cases all depend on being able to trust that information provided to us by Facebook is accurate, comprehensive, and paints a full picture of the topic at hand." 

The board said it will continue to track and report on information provided by Facebook to ensure it is "as comprehensive and complete as possible."

On Thursday, the board also announced it had accepted a request from Facebook, in the form of a policy advisory opinion, to review the company’s cross-check system and make recommendations on how it can be changed. 

The oversight board said it will also publish annual reports with a more detailed assessment of how Facebook is implementing the board’s decisions and recommendations.

The board’s announcement comes weeks after a former Facebook employee, Frances Haugen, went before Congress and cameras to accuse the social media giant of pursuing profit over safety.

The 37-year-old data scientist is by far the most visible of the company’s whistleblowers. And her accusations that Facebook’s platforms harm children and incite political violence — backed up by thousands of pages of the company’s own research — may well be the most damning.

Haugen, who said she joined the company in 2019 because "Facebook has the potential to bring out the best in us," said she didn’t leak internal documents to a newspaper and then come before Congress in order to destroy the company or to call for its breakup, as many consumer advocates and lawmakers of both parties have urged.

"Facebook’s products harm children, stoke division and weaken our democracy," Haugen said during her testimony on Oct. 5. "The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people."

Last week, Facebook also announced it would implement more stringent policies to better protect public figures from cyberbullying. 

The social media giant said it would introduce policies to help protect people from "mass harassment and intimidation" and will now remove "more harmful content" that attacks public figures. The company also said it would be providing better protections to public figures who became famous involuntarily, such as human rights activists and journalists.

The company outlined its plan to remove content that shows a coordinated effort by a large number of people whose only goal is to harass a particular person or a group of people at heightened risk of offline harm, "for example victims of violent tragedies or government dissidents," the news release said.

This story was reported from Los Angeles. The Associated Press contributed to this report.
