Facebook CEO Mark Zuckerberg has finally given his first interview on the Christchurch shootings.
Zuckerberg told ABC News that the mosque massacres, which saw 50 people slain as the gunman livestreamed to Facebook, were "a really terrible event."
But the Facebook founder had mixed feelings when interviewer George Stephanopoulos asked whether it would help to put livestreams on a slight delay, as "live" TV is.
"You know, it might, in this case," Zuckerberg replied.
"But it would also fundamentally break what livestreaming is for people. Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together. And it's one of the things that's magical about livestreaming is that it's bi-directional, right? So you're not just broadcasting. You're communicating. And people are commenting back. So if you had a delay that would break that."
Earlier this week, Privacy Commissioner John Edwards criticised Facebook for not introducing any new safeguards that would prevent a repeat of the March 15 livestream.
In a March 30 open letter, Facebook COO Sheryl Sandberg said the social network was "exploring" livestreaming restrictions for users who violated its community standards.
Yesterday, Australia's Parliament passed a tough new law under which social media companies can be fined up to 10 per cent of their revenue, and their executives jailed for up to three years, if they fail to take "swift" action on "abhorrent" content. Attorney-General Christian Porter said, "There are platforms such as YouTube, Twitter and Facebook who do not seem to take their responsibility to not show the most abhorrently violent material seriously."
A partial transcript of the interview is below. Read the full transcript here.
STEPHANOPOULOS: Do you think that social media has made acts of extreme violence more prevalent?
ZUCKERBERG: It's hard to say. I think that that's going to be something that's studied for a long time. I certainly haven't seen data that would suggest that it has. And I think that the hope is that by giving everyone a voice, you're creating a broader diversity of views that people can have out there and that even if sometimes that surfaces some ugly views, I think that the Democratic tradition that we have is that you want to get those issues on the table, so that way, you can deal with them. Certainly, though, this is why I care so much about issues like policing harmful content and hate speech, right? I don't want our work to be something that gets towards amplifying really negative stereotypes or promoting hate. So that's why we're investing so much in building up these AI systems. Now, we have 30,000 people who are doing content and security review to do as best of a job as we can of proactively identifying and removing that kind of harmful content.
STEPHANOPOULOS: What did you learn from the New Zealand experience, a few weeks back? It took about an hour to take down that live video. Clearly, it seemed like this was intended to happen on social media. What did you learn about it? What more can be done to stop it?
ZUCKERBERG: Yeah, I mean, that was a really terrible event. And we've worked with the police in New Zealand, and we still do. There were a couple of different parts of that where I think we learned. The first was in the live video, itself. I actually think the bigger part of it was -- is the second, which is all of the copies that got uploaded afterwards.
STEPHANOPOULOS: Like that, yeah.
ZUCKERBERG: So the live video itself was seen about 200 times while it was live. Most of those, it seems, were from people in a different online community, off Facebook, that this terrorist, basically, told that he was about to go do this. So they went. And a lot of those views were copying the video, so that way they could upload it a lot of times. So one of the big takeaways from that is we need to build our systems to be more advanced, to be able to identify livestream terror events more quickly, as it's happening, which is a terrible thing.
STEPHANOPOULOS: Would a delay help, any delay of livestreaming?
ZUCKERBERG: You know, it might, in this case. But it would also fundamentally break what livestreaming is for people. Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together. And it's one of the things that's magical about livestreaming is that it's bi-directional, right? So you're not just broadcasting. You're communicating. And people are commenting back. So if you had a delay that would break that.
STEPHANOPOULOS: Even 10, 15, 20 -- we have seven-second delays on TV.
ZUCKERBERG: But you're not getting comments that are...
STEPHANOPOULOS: No, we get a lot of comments. But you're right.
ZUCKERBERG: (LAUGHS) Well yeah, but afterwards. But going back to the point -- one of the things that we saw was people, basically, of those 200 people who saw this video while it was live, a lot of them copied it and then made different versions of the video that they tried to upload. In the first 24 hours, our systems automatically took down 1.2 million copies of that video that people were trying to upload. Then we took down another 300,000 that people flagged to us that our systems didn't catch proactively. But one of the things that this flagged for me, overall, was the extent to which bad actors are going to try to get around our systems. It wasn't just one copy of the video. There were 800 different versions of that video that people tried to upload. And often, they made slightly different versions of it to try to get around our systems that would catch the videos, as people tried to upload them. So it gets back to some of these issues around policing harmful content, around preventing election interference. These aren't things that you ever fully solve, right? They're ongoing arms races, where we need to make sure that our systems stay ahead of the sophisticated bad actors, who are just always going to try to game them. And that's just part of the dynamic that we're in. And we need to always keep on investing more to stay ahead of that.