An attempt by the social media giant Facebook to mollify community leaders and families ahead of the anniversary of the Christchurch shootings appears to have fallen flat.


The United States company flew in managers from Australia and Singapore for the closed, invitation-only meeting, which was held at Canterbury University on February 17.

Those attending included ethnic community leaders, Martin Cocker, the head of NetSafe (the lead agency for the complaints laid under the Harmful Digital Communications Act), Facebook Australia/New Zealand policy director Mia Garlick, and Aliya Danzeisen, of the NZ Islamic Women's Council and lead coordinator of the Women's Organisation of the Waikato Muslim Association, or WOWMA.

She was allowed to bring along Shadia Amin and Dr Maysoon Salama to represent victims' families.

One insider said the meeting became "heated".

Danzeisen's take is more nuanced.

"There were some strong discussions," she said.

The general mood of the room was positive overall, but some - including the family representatives - left feeling that while Facebook had made some progress, it was not doing enough.

Cocker also said the mood was constructive, but added that "some of the feedback to Facebook was pretty direct."

"There has been good progress against violent content on Facebook and other platforms, but there hasn't been a heck of a lot of progress against hate pages that can ignite it," the Netsafe head told the Weekend Herald.

"There are lots and lots of anti-Islamic pages that are highly offensive, but don't have violent content."

Cocker said many at the meeting voiced frustration that, despite Facebook tightening its rules and definitions, it was still too hard to get hate content taken down amid arguments over what constituted hate content, and what was protected by free speech.

He said the regional Facebook managers at the meeting were informative and receptive. "They weren't making any excuses," he said. They were empathetic, but Cocker added that if people at the meeting were looking for more action or an apology "it needs to come from higher up."

Netsafe's Martin Cocker: "There has been good progress against violent content on Facebook, but there hasn't been a heck of a lot of progress against hate pages that can ignite it." Photo / Sarah Ivey

Facebook measures since March 15 have included beefing up its filtering systems, and banning many nationalist groups.

But a key concern for people at the meeting was that hate content was still appearing, and that it often fell on the local Muslim community to report it. Facebook still did not seem proactive enough in filtering or removing content.

"They should not defer responsibility. Communities should not be the front-line for monitoring everything. Their systems still rely too much on us," Danzeisen said.

"I'd like to sit down with Mark Zuckerberg," she said. "I just don't think he realises what the impact is on the average Muslim kid - or adult - of the sustained abuse."

After the Christchurch shootings, YouTube effectively eliminated the option to livestream for most users. The Google-owned video-sharing service banned mobile livestreaming except for registered users with more than 1000 followers.

Danzeisen said she would like Facebook to follow suit, or at least introduce a slight delay.

In April last year, in his first interview after the mosque massacres, Facebook CEO and founder Mark Zuckerberg said he opposed a ban on streaming, or a delay.

"It would fundamentally break what livestreaming is for people," he said at the time.

"Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together. And it's one of the things that's magical about livestreaming is that it's bi-directional, right? So you're not just broadcasting. You're communicating. And people are commenting back. A delay would break that."

Facebook did subsequently tweak its rules so that individuals who "violate our most serious policies" could be temporarily or permanently banned from livestreaming.

Danzeisen said if there was a ban or delay on public livestreaming, events like a birthday could still be streamed live to an invite-only closed Facebook group.

Like Privacy Commissioner John Edwards, she criticised Facebook for, as she saw it, making new features live before they could be properly monitored. "You wouldn't put a car on the road until you knew it was safe," she said.

Another problem: copies of the alleged Christchurch shooter's video continue to appear online.

After the international "Christchurch Call" summit in Paris in May last year, driven by NZ Prime Minister Jacinda Ardern, Facebook said it would collaborate with a number of top US universities on a US$7.5 million ($11.8m) project to research systems that could better detect harmful content.

New York-based hate-content researcher Eric Feinberg has been constantly able to find copies of the gunman's clip. On January 29 this year, he found 14 copies of the gunman's raw footage across Facebook and Facebook-owned Instagram.

The alleged gunman's video was rated objectionable by the Chief Censor, along with his "manifesto." This makes the material illegal to view or share.

In June last year, Christchurch man Philip Neville Arps was sentenced to 21 months in prison for sharing the clip. He was released on January 29, with a GPS tracker and a condition that he not go near the two mosques.

Facebook ANZ policy director Garlick said: "Since March 15, we've made significant changes and investments to advance our proactive detection technology, grow our teams working on safety, and respond even quicker to acts of violence.

"No single solution can prevent hate and violent extremism, but the meaningful progress on the commitments made to the Christchurch Call are delivering real action in New Zealand and internationally."

The company regularly met with community leaders, Garlick said.

Between July and September last year, Facebook says it took down:

• 7m pieces of hate speech, 80.2% proactively before it was reported (up from 53% this time the year prior)

• 29.3m pieces of graphic violent content, 98.6% proactively before it was reported; and

• 5.2m pieces of terrorist propaganda, 98.5% proactively before it was reported

Danzeisen stressed she was not totally opposed to Facebook.

"They are trying, in some areas," she said.

She often used it herself in her community and youth work.

"There are a lot of positives. We use it to create more community engagement and involvement. For example to organise a fishing trip, or a camp, and to share what we did. People new to our community or new to New Zealand find us through Facebook. It definitely helps us in that way - but people who want to hurt us can find us too."

Where to get help:
Lifeline: 0800 543 354 (available 24/7)
What's Up: 0800 942 8787 (1pm to 11pm)
Depression helpline: 0800 111 757 (available 24/7)
Youthline: 0800 376 633
Kidsline: 0800 543 754 (available 24/7)
If it is an emergency and you feel like you or someone else is at risk, call 111.