Photo taken on March 16, 2019 shows cards and flowers people placed to mourn the victims of the attacks on two mosques in Christchurch, New Zealand. (Xinhua/Guo Lei)
New Zealand Prime Minister Jacinda Ardern on Sunday said she would be looking for answers from Facebook and other social media firms about how an attack that killed 50 mosque-goers was livestreamed on their platforms.
"Certainly, I have had contact from Sheryl Sandberg. I haven't spoken to her directly but she has reached out, an acknowledgment of what has occurred here in New Zealand," Ardern said at a media conference when asked if Facebook should stop livestreaming.
Sheryl Sandberg, Facebook's chief operating officer, has sent condolences over the shootings at two mosques that killed 50 people, attacks that were partly livestreamed on the social media platform.
The horrific video shot by the gunman who carried out the mosque massacre was livestreamed on Facebook before being removed by the company. But the stream, lasting 17 minutes, was shared repeatedly on Twitter, Alphabet Inc's YouTube, and Facebook-owned WhatsApp and Instagram.
Internet platforms were scrambling to take down reposted videos of the gruesome scenes.
"We did as much as we could to remove, or seek to have removed, some of the footage that was being circulated in the aftermath of this terrorist attack," Ardern said. "But ultimately it has been up to those platforms to facilitate their removal. I do think that there are further questions to be answered."
Facebook, Twitter and YouTube all said they had taken steps to remove copies of the videos. Facebook said it had deleted the gunman's accounts "shortly after the livestream commenced," after being alerted by police.
In a statement on Sunday, Mia Garlick of Facebook New Zealand vowed to "work around the clock to remove violating content".
"In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload," the company said.
Ardern was joined by Australian Prime Minister Scott Morrison in expressing doubts that current rules go far enough.
Morrison said that social media companies had co-operated since the attack. "But I sadly have to say that the capacity to actually assist fully is very limited on the technology side."
He said "assurances were given" that once such content was pulled down, a regime would make sure it did not go back up. "Clearly it hasn't (happened)."
Videos of the shooting appeared on all five platforms up to 10 hours after the attacks, which began at 13:45 local time in the city of Christchurch, Reuters found.
And in a 15-minute window, it also found five copies of the footage on YouTube, uploaded so as to surface under the search term "New Zealand" and tagged with categories including "education" and "people & blogs."
In another case, the video was shared by a verified Instagram user in Indonesia with more than 1.6 million followers.
"So I think there are some very real discussions that have to be had about how these facilities and capabilities, as they exist on social media, can continue to be offered," Morrison said.
The shootings in New Zealand show how the services tech giants offer can be exploited by extremist groups, said Lucinda Creighton, senior adviser to the Counter Extremism Project.
"Extremists will always look for ways to utilize communications tools to spread hateful ideologies and violence," she said. "Platforms can't prevent that, but much more can be done by platforms to prevent such content from gaining a foothold and spreading."
(With inputs from agencies)