YouTube has announced a change to its election misinformation policy. As of June 2, 2023, the platform no longer removes content making false claims of “widespread fraud, errors or glitches” in past elections, including the 2020 US presidential election.
The change is an attempt to balance YouTube’s commitment to safety with its commitment to open political discourse. With another US presidential election approaching in 2024, you may be wondering what exactly this means and why YouTube would make this change now.
How YouTube’s Election Misinformation Policy Started
YouTube’s election misinformation policy sets the platform’s standards for publishers and creators covering or commenting on elections. In December 2020, YouTube banned content making false claims of “widespread fraud, errors or glitches” in the 2020 US presidential election.
The platform implemented this policy after the December 8 “safe harbor” deadline for certifying that year’s election. Under US federal law, the safe harbor deadline falls six days before the Electoral College votes; it is the deadline for each state to certify its election results or resolve contested results.
YouTube is more than crowdsourced entertainment. According to the Pew Research Center, it was also a news source for more than one-quarter of all Americans in 2020, including content from mainstream media outlets as well as independent creators.
Additionally, according to the US Census Bureau, the 2020 election saw the highest voter turnout of the 21st century, with over 68% of eligible citizens voting. Between that higher-than-usual turnout and YouTube’s wide reach, the platform attracted creators with a whole range of beliefs and theories about the election.
In a blog post, YouTube says it removed “tens of thousands” of videos that violated its election misinformation policy in the two years following the 2020 election. In a separate post, the platform noted that more than 77% of those videos were removed before reaching 100 views.
Of course, YouTube wasn’t the only platform to do so. For example, TikTok removed over 300,000 videos for election misinformation.
While the 2020 US presidential election was the catalyst for the election misinformation policy, YouTube’s standards also apply to elections outside the US: the platform published policy guidelines specifically for the 2021 German federal election and the 2014, 2018 and 2022 Brazilian presidential elections.
Why did YouTube change its misinformation policy?
YouTube said that its interest in preserving open political discourse prompted the June 2023 update:
“While removing this content curbs some misinformation, it may also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
This does not mean YouTube has rolled back all restrictions. Under the updated election misinformation policy, “deceptive or misleading content with a serious risk of egregious harm” is still not allowed.
YouTube also maintains its restrictions against harassment, hate speech and incitement to violence, which apply to all content, including election commentary. Within the broader media landscape, YouTube’s rollback of restrictions related to the 2020 US presidential election mirrors changes Twitter made earlier in 2023.
CNBC reported that tech industry layoffs hit the trust and safety teams at Meta, Google and Twitter. With smaller teams, enforcing the previous policy would have been even more difficult heading into 2024. If you see content that you believe violates YouTube’s policies, follow the reporting process outlined by YouTube’s support team.
What will YouTube’s change mean for the 2024 US presidential election?
Most of YouTube’s election misinformation policy remains unchanged, and the update doesn’t prevent YouTube from removing content that the platform determines spreads misinformation about future elections, in the US or anywhere else in the world.