The tech giant “has not been fully forthcoming on Cross-Check,” the oversight board said in a report published on Thursday. “On some occasions, Facebook failed to provide relevant information to the Board, while in other instances, the information it did provide was incomplete,” it added.
Facebook uses Cross-Check to review content decisions relating to high-profile users, such as politicians, celebrities and journalists. The program had mushroomed to include 5.8 million users in 2020, according to the Wall Street Journal.
The Facebook Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. Its members are appointed by the company but operate independently. The Oversight Board is often described as a kind of Supreme Court for Facebook, as it allows users to appeal content decisions on Facebook-owned platforms.
In a report published last month, the Wall Street Journal used internal company documents to show that Cross-Check shields VIPs from Facebook’s normal enforcement processes. In practice, that means posts that violate the company’s rules are not immediately removed, or that certain individuals are effectively immune from enforcement actions.
“At times, the documents show, [Cross-Check] has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users,” according to the Journal.
In a written statement, Facebook spokesman Andy Stone told the Journal that criticism of Cross-Check was fair, but added that the system “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”
Despite Cross-Check’s considerable size, Facebook did not mention the program when it asked the oversight board to review its decision to block former President Donald Trump from using its platform. Instead, Facebook only mentioned the program when the oversight board asked whether Trump’s page or account had been subject to ordinary content moderation processes.
Facebook told the oversight board that the program applied only “to a small number of decisions,” which the company subsequently acknowledged was misleading, the board said. It also provided “no meaningful transparency on the criteria for accounts or pages being selected for inclusion in Cross-Check” despite a request from the board to do so.
The board said Thursday that it has accepted a request from Facebook to review Cross-Check and make recommendations on how it can be changed.