Meta users are being left to wade through hate and misinformation
The end of fact-checking: Meta shifts the job of policing false claims to its users
Meta is essentially shifting responsibility to users to weed out lies on Facebook, Instagram, Threads, and WhatsApp, raising fears that it’ll be easier to spread misleading information about climate change, clean energy, public health risks, and communities often targeted with violence.
A lot of people think moderation doesn’t work and is just window dressing so that platforms can say they are doing something. But most people also don’t want to wade through a flood of misinformation. “The losers here are people who want to be able to go on social media and not be overwhelmed with false information.”
Duke says he was very disappointed to hear Mark Zuckerberg claim that the organizations in Meta’s US third-party fact-checking program are politically biased. “Let me fact-check that. Lead Stories is a publication that follows the highest standards of journalism and ethics. We don’t look at the political spectrum to see if a false claim is made.”
Poynter owns PolitiFact, one of the fact-checking partners Meta works with in the US. Angie Drobnic Holan worked at PolitiFact before taking over as head of the International Fact-Checking Network (IFCN). What makes the fact-checking program effective, Holan says, is that it serves as a “speed bump in the way of false information.” Flagged content is covered by a screen that informs users about the fact-check and asks whether they still want to see it.
The program covers a wide range of topics, from false claims about celebrity deaths to purported miracle cures. Meta launched it in 2016 amid public concern about fake stories spreading on social media.
Critics call the decision a capitulation to Trump and Musk
“Zuck’s announcement is a full bending of the knee to Trump and an attempt to catch up to [Elon] Musk in his race to the bottom. The implications are going to be widespread,” Nina Jankowicz, CEO of the nonprofit American Sunlight Project and an adjunct professor at Syracuse University who researches disinformation, said in a post on Bluesky.
Twitter launched its community moderation program, called Birdwatch at the time, in 2021, before Musk took over. Musk, who helped fund Trump’s campaign and is now set to lead a new Department of Government Efficiency, leaned into Community Notes after the teams responsible for content moderation at Twitter were slashed. Hate speech — including slurs against Black and transgender people — increased on the platform after Musk bought the company, according to research by the Center for Countering Digital Hate. A federal judge later dismissed a lawsuit X brought against the center.
Meta says that it is getting rid of restrictions on topics like immigration and gender identity that are the subject of frequent political discourse and debate.
Environmental groups are not happy with the changes at Meta. “Mark Zuckerberg’s decision to abandon efforts to check facts and correct misinformation and disinformation means that anti-scientific content will continue to proliferate on Meta platforms,” Kate Cell, senior climate campaign manager at the Union of Concerned Scientists, said in an emailed statement.
“I think this is a terrible decision … disinformation’s effects on our policies have become more and more obvious,” says Michael Khoo, a climate disinformation program director at Friends of the Earth. He points to attacks on wind power affecting renewable energy projects as an example.
Khoo also likens the Community Notes approach to the fossil fuel industry’s marketing of recycling as a solution to plastic waste. Most plastic products are not actually recyclable, and the tide of plastic pollution keeps flooding into the environment. The strategy also puts the onus on consumers to deal with a company’s waste. “[Tech] companies need to own the problem of disinformation that their own algorithms are creating,” Khoo tells The Verge.
Meta is replacing its US fact-checking program with X-style Community Notes
The announcement was made in a post by Meta’s newly appointed chief global affairs officer, Joel Kaplan, who said the move would allow more topics to be openly discussed. The change will first affect the company’s moderation in the US.
“We will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations,” Kaplan said, though he did not detail what topics these new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content return to people’s feeds, as well as posts on other contentious issues that have been flashpoints in the US in recent years.
Meta was criticized for its moderation of content related to last year’s high-profile elections.
Kaplan criticized fact-checking experts for their “biases and perspectives,” which he said led to over-moderation of what many people would understand to be legitimate political speech and debate.
However, WIRED reported last year that dangerous content like medical misinformation has flourished on the platform, while groups like anti-government militias have used Facebook to recruit new members.
In a bid to remove bias, Zuckerberg said Meta’s in-house trust and safety team would be moving from California to Texas, which is also now home to X’s headquarters. He said the move would help the company build trust as it works on free expression, particularly in places where people are concerned about the bias of its teams.
Duke says Lead Stories has a diverse revenue stream and most of its operations are outside the US, but he says the decision will still have an impact. “The most painful part of this is losing some very good, experienced journalists, who will no longer be paid to research false claims found on Meta platforms,” Duke says.
The news organizations that had partnered with Meta since 2016 to tackle the spread of disinformation on its platforms are scrambling to figure out how the change will affect them.
Ten of these fact-checking organizations are based in the US, and Meta’s new rules will be applied to them first.
The news that Meta was no longer planning to use their services was announced in a blog post by chief global affairs officer Joel Kaplan on Tuesday morning and an accompanying video from Meta CEO Mark Zuckerberg. The company plans to replace the program with X-style Community Notes, which allow users to flag posts they think are inaccurate or require further explanation.
Alan Duke is the editor and co-founder of Lead Stories, a fact-checking website that started working with Meta in 2019. He says the organization received no notice of the change.