White House: We’re flagging Facebook posts that spread COVID ‘misinformation’

The Biden administration has been “flagging problematic posts” for Facebook to remove, White House press secretary Jen Psaki disclosed Thursday.

She said during the daily media briefing that the Office of the Surgeon General has ramped up its tracking of “disinformation” related to the COVID-19 pandemic on social media platforms.

“We’re flagging problematic posts for Facebook that spread disinformation. We’re working with doctors and medical experts … who are popular with their audience with accurate information,” she said. “So, we’re helping get trusted content out there.”


Along with artificial intelligence tools, Facebook deploys panels of fact-checkers who remove information that doesn’t comport with the views of selected “experts.”

Until recently, when establishment media began to pay attention to the evidence, Facebook censored stories on the possibility that the COVID-19 pandemic originated in a lab in Wuhan, China.

Psaki said the White House also has requested that Facebook share relevant data regarding any posts that “spread disinformation.” The administration, she said, is asking Facebook to “measure and publicly share the impact of misinformation on their platform.”

“Facebook should provide, publicly and transparently, data on the reach of COVID-19 vaccine misinformation, not just engagement, but the reach of the misinformation and the audience that it’s reaching,” Psaki said. “That will help us ensure we’re getting accurate information to people. This should be provided not just to researchers but to the public so that the public knows and understands what is accurate and inaccurate.”

The White House also wants Facebook and other social media platforms to “create a robust enforcement strategy” and to take “faster action against harmful posts.”

Psaki, without mentioning any names, said there are “about 12 people who are producing 65% of anti-vaccine misinformation on social media platforms.”

“All of them remain active on Facebook, despite some even being banned on other platforms, including ones that Facebook owns,” she lamented.

Psaki added that Facebook “has repeatedly shown that they have the levers to promote quality information.”

“We have seen them effectively do this in their algorithm over low-quality information, and they’ve chosen not to use it in this case, and that is certainly an area that would have an impact,” she said.

In February, Facebook announced a plan to mobilize a panel of scholars to “debunk myths about climate change.”

At the time, CNBC described Facebook’s move as “further leaning in to the ‘arbiter of truth’ role that the company once renounced.”

It was a further move away from the statement of CEO Mark Zuckerberg in May 2020 that he doesn’t think Facebook or other internet platforms “in general should be arbiters of truth.”

But now Zuckerberg contends that the best way to stop the spread of misinformation on Facebook’s platform is not only to remove “misleading” posts, but to offer people “accurate information from authoritative sources.”