15 Mar, 2019 | FORTUNE.COM
Facebook has removed a livestreamed video that reportedly showed the terror attacks that took place Friday afternoon in Christchurch, New Zealand. Other platforms were reportedly slower to act, bringing the spotlight back onto social media firms' capacity to quickly remove extremist content from their platforms.
Around the time of Friday prayers, gunmen opened fire in two Christchurch mosques, killing at least 49 people. Explosive devices were found on one of the attackers’ vehicles. Four people have been arrested, and one has been confirmed by police as being an Australian citizen.
One of the gunmen, who reportedly published a manifesto expressing white-supremacist motivations, streamed his attack online, reportedly from a head-mounted camera. He uploaded it to Facebook, which scrubbed the 17-minute video once it had been flagged by police.
“Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Facebook said. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
However, the video had already spread. YouTube and Twitter took hours to remove it from their platforms, according to Bloomberg. Some media outlets also repeatedly broadcast clips from the video, despite police urging them not to.
Fortune has asked YouTube and Twitter for comment on the reports.
Police told reporters that they were also investigating a report that a Facebook page had carried a warning of the attack, in the form of a threat against the Muslim community.
New Zealand does not have an extensive history of mass shootings; the deadliest prior incident, three decades ago, took 13 lives. Its gun laws require those purchasing firearms to be at least 16 years old and pass vetting, but do not require the registration of weapons.