Facebook Announces New ‘Safeguards’ for Midterms to Block ‘Misinformation’

Facebook’s parent company Meta has announced its new “policies and safeguards” that seek to block “misinformation” ahead of the November midterm elections.

The social media giant says it will prevent “election and voter interference” by using so-called “fact-checkers” to “connect” people with “reliable information.”

In a Tuesday blog post, Meta revealed that its new series of policies for the 2022 midterms is "consistent" with the policies the platform used during the 2020 election.

The blog post failed to mention, however, that Facebook CEO Mark Zuckerberg invested hundreds of millions of dollars to help Democrats win during the 2020 election cycle.

Meta’s election policies include “Preventing Election and Voter Interference,” “Connecting People With Reliable Information,” and “Transparency Around Elections and Advertising.”

Meta President of Global Affairs Nick Clegg said in the announcement that Facebook used its “learnings from the 2020 election cycle” to apply “advanced security operations” for the midterms.

“With the 2022 US midterms on the horizon, we are setting out how our approach applies in this election cycle, which is largely consistent with the policies and safeguards we had in place during the 2020 US Presidential election,” Clegg wrote.

“Our approach to the 2022 US midterms applies learnings from the 2020 election cycle and exceeds the measures we implemented during the last midterm election in 2018.

“This includes advanced security operations to fight foreign interference and domestic influence campaigns, our network of independent fact-checking partners, our industry-leading transparency measures around political advertising and pages, as well as new measures to help keep poll workers safe,” the announcement added.

Under the heading “Preventing Election and Voter Interference,” Meta touted its existing policies around removing “hate speech.”

The company boasts that it has already banned 270 "white supremacist" organizations and removed more than 2.5 million posts for "hate speech" in the first quarter of 2022 alone.

Meta then touted its relationships with state and local election officials, federal agencies, and industry peers.

It went on to outline its content removal policies:


“As was the case in the US in 2020, election-related content we will remove includes misinformation about the dates, locations, times, and methods of voting; misinformation about who can vote, whether a vote will be counted, and qualifications for voting; and calls for violence related to voting, voter registration, or the administration or outcome of an election.

“We will reject ads encouraging people not to vote or calling into question the legitimacy of the upcoming election.”

The company also announced that it would prohibit new political, electoral, and social issue ads during the final week before the election.

Meta announced additional policies under the heading “Connecting People With Reliable Information.”

The company lauded its continuing practice of sending notifications with voting information to Facebook users, as well as its partnerships with election officials.

Meta also announced it would offer its voting information features in Spanish for users who interact with Spanish-language content.

It also outlined changes to its election “fact-checking” system.

“We have 10 fact-checking partners in the US to address viral misinformation,” the company said.

“We add warning labels to content they debunk so that people can decide for themselves what to read, trust and share.”

Meta failed to note that information debunked by "fact-checkers" is blocked from appearing in Facebook's News Feed, thereby removing the opportunity for people to "decide for themselves."

The announcement goes on to reveal that the company plans to pump millions of dollars into "fact-checking" efforts for the midterms:

"We're also investing an additional $5 million in fact-checking and media literacy initiatives ahead of the midterms," including fact-checking services on WhatsApp, funding for fact-checkers to increase their capacity during the elections, and the development of "media literacy resources to teach people how to identify misinformation for themselves."

The changes to Meta’s policies come as the company announced that it would decrease the reach of political content on Facebook more broadly.

House Minority Leader Kevin McCarthy (R-CA) blasted the platform in a tweet earlier this month.

“Facebook made sure no one saw the Hunter Biden laptop story before the 2020 election,” McCarthy wrote.

“But now that America has record inflation, rising crime, & a border crisis — all as a result of Dem policies — Facebook is shutting down more ‘political content’ to hide the truth from Americans.”

Last week, Twitter also introduced a similar set of policies to “protect civic conversation” ahead of the midterms, as Slay News reported.

By Frank Bergman

Frank Bergman is a political/economic journalist living on the east coast. Aside from news reporting, Bergman also conducts interviews with researchers and material experts and investigates influential individuals and organizations in the sociopolitical world.
