The notion that Facebook was unable to detect the Christchurch mosque gunman's livestream because its contents were not "particularly gruesome" will be abhorrent to many of us.

Facebook's policy director for counter-terrorism this week reportedly told US Congress members that its algorithm did not detect the massacre livestream because there was "not enough gore".

The video, since declared an objectionable publication in New Zealand, was streamed for 17 minutes.

An algorithm is, in essence, a set of preprogrammed instructions that, in this case, detects objectionable material and shuts it down. In a way, it lets a computer "think".


In simpler terms, it is like the filter setting you can apply on your computer so that certain material does not show up when you or the kids do a Google search.

A few years ago, Facebook introduced counter-terror algorithms, but on March 15 they didn't work.

I have a smidgen of empathy for Facebook. It stems only from the fact that mainstream media also provides a platform for communication.

We do it through radio, print and online. And sometimes it can be difficult to control the views and opinions that are being expressed. The main area of difficulty? Facebook.

And this is where any empathy I have with Facebook evaporates into the digital ether.

Because mainstream media, having created multiple platforms for communication, regulates offensive content and comment.

Facebook, on the other hand, has created a monster it is now struggling to control.

In the early days after March 15, Facebook's silence around the gunman's use of its platform to livestream was telling. It was a silence that suggested they didn't care. Or know what to do.

All it would have taken were a few words to express disgust and a desire to prevent that disgusting act from ever happening again.

It's a pity there isn't an algorithm to make the people who own Facebook more human.

Facebook this week also moved to explain what it is now doing to try to prevent a repeat of the March 15 livestream.

Within Facebook's newsfeed, there are thousands of words describing the action being taken.

It looks comprehensive, but really, most of us just want to hear a few words: it won't happen again.