In the aftermath of the New Zealand mosque shooting, which was streamed live on Facebook, the social media company said it removed 1.5 million videos of the attack from its site.
On Sunday, Facebook said the 1.5 million videos were taken down in the first 24 hours after the Friday mass shooting at two Christchurch mosques left 50 people dead. The company said 1.2 million of those videos were blocked at upload, but it's not clear how many people watched the 300,000 videos that slipped through before they were deleted.
When Brenton Tarrant, the 28-year-old alleged gunman, opened fire at Al Noor Mosque, he was recording. The result was a 17-minute Facebook Live video. Facebook didn't delete the original video until it was notified about it by police, but by then it had already been downloaded and shared countless times.
"New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video," Mia Garlick, Facebook's director of policy for Australia and New Zealand, said in a statement on Friday after the shootings.
Garlick added that Facebook is "removing any praise or support for the crime and the shooter or shooters as soon as we're aware."
Footage from the livestream also quickly spilled over to other social media platforms.
Jacqueline Helfgott, a professor of criminal justice at Seattle University, said that for some perpetrators, social media itself can be a motivator for committing a crime.
“We are starting to see crimes being committed where social media is actually part of the M.O. and signature behavior of the crime. It’s used in a way in which it becomes an integral part of the crime,” Helfgott told InsideEdition.com.
Helfgott said that particularly in terrorist attacks, using social media can fulfill the ultimate goal of terrorism.
“Social media provides a way to get an audience for terrorist types of crimes, which have the goal of producing terror in as many people as possible,” Helfgott said. “When people are Facebook living violent acts, part of their motive is performance and trying to get some sort of celebrity status out of violence and in the case of terrorism, having a way to terrify the most people with a click of a button.”
In May 2017, Facebook tried to limit the number of violent videos that made it onto the site by hiring 3,000 employees to sift through the content being uploaded and take down anything showing “murder, suicide and violence.”
“Given the importance of this, how quickly live video is growing, we wanted to make sure that we double down on this and make sure that we provide as safe of an experience for the community as we can,” Facebook CEO Mark Zuckerberg said in a statement at the time.
The company said it’s now “hashing” the New Zealand shooting video, meaning it uses technology to detect when a video being uploaded matches the original shooting video and automatically removes it.
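At its simplest, hashing means computing a compact fingerprint of a file and blocking any upload whose fingerprint matches a banned one. The sketch below is a hypothetical illustration using an exact-match cryptographic hash; Facebook has not published its implementation, and production systems would need perceptual hashes that survive re-encoding and cropping, since an exact hash misses even slightly altered copies, which is one reason re-uploads can slip through.

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    # Exact-match fingerprint: byte-identical copies produce the same digest.
    return hashlib.sha256(video_bytes).hexdigest()

class UploadFilter:
    """Hypothetical filter that blocks uploads matching a banned video's hash."""

    def __init__(self) -> None:
        self.banned: set[str] = set()

    def ban(self, video_bytes: bytes) -> None:
        self.banned.add(fingerprint(video_bytes))

    def allow_upload(self, video_bytes: bytes) -> bool:
        return fingerprint(video_bytes) not in self.banned

# Demo with placeholder bytes standing in for video data.
original = b"\x00\x01placeholder-video-payload"
filter_ = UploadFilter()
filter_.ban(original)
print(filter_.allow_upload(original))          # exact copy is blocked -> False
print(filter_.allow_upload(original + b"x"))   # altered copy evades the hash -> True
```

The second check shows the weakness: changing a single byte defeats an exact hash, which is why perceptual hashing (fingerprints tolerant of visual similarity) is the technique platforms reportedly rely on.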
Authorities also urged members of the public not to share the harrowing footage.
“There is extremely distressing footage relating to the incident in Christchurch circulating online,” New Zealand police wrote on Twitter Friday. “We would strongly urge that the link not be shared.”
Despite social media companies’ efforts to police these types of violent videos, experts say the reality is that many people will continue to watch.
“It’s similar to why people drive by a car accident; people are fascinated by the boundaries of human behavior, but they are also vicarious victims,” Helfgott said. “The only way most people experience violent death is by watching these media-mediated versions of it. It’s a form of psychological processing.”