
YouTube rolls back its rules against election misinformation

YouTube was the slowest major platform to disallow misinformation during the 2020 U.S. election, and almost three years later, the company is tossing that policy out altogether.

The company announced Friday that it would reverse its rules around election denialism, allowing some previously prohibited false claims, effective immediately. Axios first reported the changes.

“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the company wrote in a blog post.

“With that in mind, and with 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.”

YouTube still won’t allow some kinds of false election-related claims, like lying about the location of polling places and other specific efforts to dissuade people from successfully casting a vote.

“All of our election misinformation policies remain in place, including those that disallow content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes,” the company wrote.

There’s certainly an argument that, on the whole, denying the valid results of a presidential election ultimately does more to discourage people from voting than these more targeted hypothetical scenarios. But it doesn’t appear that allowing users to sow broad mistrust in the democratic process fits into the company’s definition of “real-world harm.”

Even if enforcement was challenging, it’s a strange choice to announce that it’s open season for U.S. election denial on YouTube, particularly with the 2024 race gearing up. The company plans to offer more updates around its 2024 election strategy in the next few months, so hopefully YouTube elaborates on its thinking or other planned precautions then.

YouTube rolls back its rules against election misinformation by Taylor Hatmaker originally published on TechCrunch
