What just happened? Social media sites have been battling the spread of the New Zealand terror attack video. Facebook alone removed 1.5 million copies of the clip in the first 24 hours after the shooting, 1.2 million of which were blocked at the point of upload.

Fifty people died and at least 20 more were injured in the shooting last Friday, which took place at two mosques in Christchurch, New Zealand. The gunman livestreamed the attack on Facebook using a head-mounted camera, and while the social network removed the video and the shooter's account, the 17-minute clip continued to be shared online.

Although Facebook blocked 1.2 million copies of the video at upload, another 300,000 made their way onto the site before being taken down. The company also blocked edited versions that removed the graphic footage, a step it said was taken as a mark of respect to those affected and over the "concerns of local authorities."

YouTube and Twitter have also been removing the video, and Reddit banned the notorious r/watchpeopledie subreddit in the wake of the massacre, citing a site policy against "glorifying violence." Even Valve was forced into action, removing Steam profiles that allegedly paid tribute to the shooter.

The perpetrator mentioned PewDiePie during the livestream, a reference the YouTube star said he found "sickening."

This isn't the first time extreme acts of violence have been streamed on Facebook. In 2016, the shooting of police officers appeared on Facebook Live, and a year later, a Thai man used the platform to broadcast the murder of his 11-month-old daughter.

Following the attacks, New Zealand Prime Minister Jacinda Ardern said she wants to discuss the subject of livestreaming with Facebook.