Facebook Inc. is considering placing restrictions on who can post live videos in the wake of a shooting in New Zealand earlier this month that was filmed and disseminated in real time.

The social media company came under sharp criticism for not taking the video down fast enough and for letting it be circulated across the internet and uploaded to other platforms like YouTube.

Facebook has said the original video, in which the alleged gunman killed 50 people in two mosques in Christchurch, was viewed by about 200 people during its 17-minute live broadcast.

Though Facebook took the video off the assailant's page 12 minutes after the livestream ended, it spread quickly as people shared and re-edited the footage, making it harder for the company's systems to block it.

"That's why we must work to continually stay ahead," Sheryl Sandberg, chief operating officer, said in a blog post on Instagram Friday.

"In the past week, we have also made changes to our review process to help us improve our response time to videos like this in the future."

Facebook may consider factors such as prior Community Standards violations in determining who is permitted to "go Live," Sandberg wrote. The company is also investing in research to build better technology to quickly identify edited versions of violent videos and images and to prevent people from re-sharing them, she said. Facebook has identified more than 900 different videos showing portions of the original.

Facebook is also taking stronger steps to remove hate from its platform and is using artificial intelligence tools to identify and remove hate groups in Australia and New Zealand, Sandberg said.