YouTube to ban hateful, discriminatory videos
June 5, 2019
YouTube announced Wednesday that it was enforcing a ban on videos that promote racism and discrimination.
The Google-owned streaming service said that content that glorifies ideologies like Nazism, white supremacy and other extremist views had no place on its platform.
The new YouTube policy will prohibit "videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," the company said in a statement.
YouTube will also remove existing content that denies well-documented violent events, such as the Holocaust, US school shootings or the September 11, 2001 terrorist attacks.
"YouTube has always had rules ahead of the road, including a longstanding policy against hate speech," a company underlined.
The move comes amid a series of measures taken by tech giants to filter out hateful and violent content, which has triggered calls for tougher internet regulation.
Read more: YouTube turns off comments for videos showing kids
"We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we'll be gradually expanding coverage over the next several months," YouTube said on Wednesday.
The streaming service also said it was making changes to its algorithm that recommends videos to its users.
Read more: EU hails social media crackdown on hate speech
A curb on free speech?
YouTube, however, acknowledged that its new policy would affect the work of researchers who analyze videos "to understand hate in order to combat it."
Internet restrictions are also opposed by advocates of free speech, who argue that all kinds of ideas should be allowed in the public sphere.
Jonathan Greenblatt, chief executive of the Anti-Defamation League, which researches anti-Semitism, said the move alone was insufficient to curb discrimination. "Many more changes from YouTube and other tech companies" would be needed to "adequately counter the scourge of online hate and extremism," Greenblatt said in a statement.
Read more: Bundestag passes law to fine social media companies for not deleting hate speech
Social media giants, including Facebook and Twitter, were heavily criticized after the March 15 terror attacks on New Zealand mosques for their perceived inactivity in dealing with material livestreamed by the suspect.
Following the attacks, which saw 51 people killed, Facebook admitted that it had not done enough. The attacker livestreamed the rampage on Facebook for 17 minutes before the company removed it, and clips from the stream had already gone viral. An Australian white supremacist was charged with murder and terrorism.
Facebook subsequently announced that it would ban praise or support of white nationalism and white separatism as part of a crackdown on hate speech.
Read more: Facebook to tighten livestream access after Christchurch attacks
shs/rt (Reuters, AFP, dpa)