New York

In a number of sweeping changes that will significantly alter the way posts, videos and other content are moderated online, Meta will adjust its content review policies on Facebook and Instagram, getting rid of fact checkers and replacing them with user-generated “community notes,” similar to those on Elon Musk’s X, CEO Mark Zuckerberg announced Tuesday.

The changes come just before President-elect Donald Trump is set to take office. Trump and other Republicans have lambasted Zuckerberg and Meta for what they view as censorship of right-wing voices.

“Fact checkers have been too politically biased and have destroyed more trust than they’ve created,” Zuckerberg said in a video announcing the new policy Tuesday. “What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas, and it’s gone too far.”

Zuckerberg, however, acknowledged a “tradeoff” in the new policy, noting that more harmful content will appear on the platform as a result of the content moderation changes.

Meta’s newly appointed Chief of Global Affairs Joel Kaplan told Fox on Tuesday that Meta’s partnerships with third-party fact checkers were “well intentioned at the outset but there’s just been too much political bias in what they choose to fact check and how.”

The announcement comes amid a broader apparent ideological shift to the right within Meta’s top ranks, and as Zuckerberg seeks to improve his relationship with Trump before the president-elect takes office later this month. Just one day earlier, Meta announced Trump ally and UFC CEO Dana White would join its board, along with two other new directors. Meta has also said it will donate $1 million to Trump’s inaugural fund, and that Zuckerberg wants to take an “active role” in tech policy discussions.

Kaplan, a prominent Republican who was elevated to the company’s top policy job last week, acknowledged that the Tuesday announcement is directly related to the changing administration.

He said that there’s “no question that there has been a change over the last four years. We saw a lot of societal and political pressure, all in the direction of more content moderation, more censorship, and we’ve got a real opportunity. Now we’ve got a new administration and a new president coming in who are big defenders of free expression, and that makes a difference.”

Meta gave Trump’s team an advance heads-up that the moderation policy change was coming, a source familiar with the conversation said.

The Real Facebook Oversight Board, an outside accountability organization whose name is a play on the company’s official group and which comprises academics, lawyers and civil rights advocates, including early Facebook investor Roger McNamee, said the policy changes represent Meta going “full MAGA.”

“Meta’s announcement today is a retreat from any sane and safe approach to content moderation,” the group said in a statement, calling the changes “political pandering.”

The moderation changes mark a stunning reversal in how Meta handles false and misleading claims on its platforms.

In 2016, the company launched an independent fact-checking program in the wake of claims that it had failed to stop foreign actors from leveraging its platforms to spread disinformation and sow discord among Americans. In the years since, it has continued to grapple with the spread of controversial content on its platforms, such as misinformation about elections, anti-vaccination stories, violence and hate speech.

The company built up safety teams, introduced automated programs to filter out or reduce the visibility of false claims and instituted a sort of independent Supreme Court for tricky moderation decisions, known as the Oversight Board.

Although Meta’s fact-checking partners repeatedly said they checked claims from both the right and the left, Trump supporters and other conservatives have long claimed that the system restricted their voices.

“Anything I put on there about our president is generally only on for a few minutes and then suddenly they’re fact checking me, saying this, that and the other thing, which I know is not true. Their fact checker’s wrong,” one Trump supporter said at a rally in 2020.

But now, Zuckerberg is following in the footsteps of fellow social media leader Musk, who, after acquiring X, then known as Twitter, in 2022, dismantled the company’s fact-checking teams and made user-generated context labels called community notes the platform’s only method of correcting false claims.

Starting in the United States, Meta says it is ending its partnership with third-party fact checkers and instituting a similar community notes feature across its platforms, including Facebook, Instagram and Threads.

“I think Elon has played an incredibly important role in moving the debate and getting people refocused on free expression, and that’s been really constructive and productive,” Kaplan said.

The company also plans to adjust its automated systems that scan for policy violations, which it says have resulted in “too much content being censored that shouldn’t have been.” The systems will now be focused on checking only for illegal and “high-severity” violations such as terrorism, child sexual exploitation, drugs, fraud and scams. Other concerns will have to be reported by users before the company evaluates them.

Zuckerberg said Tuesday that Meta’s complex content moderation systems have mistakenly resulted in too much non-violating content being removed from the platform. For example, if the systems get something wrong 1% of the time, that could affect millions of the company’s more than 2 billion users.

“We’ve reached a point where it’s just too many mistakes and too much censorship,” Zuckerberg said.

But Zuckerberg acknowledged that the new policy could create new problems for content moderation.

“The reality is this is a tradeoff,” he said in the video. “It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”

The company is also getting rid of content restrictions on certain topics, such as immigration and gender identity, and rolling back limits on how much politics-related content users see in their feeds.

As part of the changes, Meta will move its trust and safety teams responsible for content policies from California to Texas and other US locations. “I think that will help us build trust to do this work in places where there is less concern about the bias of our teams,” Zuckerberg said.

This is a developing story and will be updated.
