Within hours of the assassination of a history teacher by an 18-year-old Islamist in France on Friday, fingers were pointed at social media platforms for having helped motivate the killer before he decapitated Samuel Paty and then for allowing him to gruesomely claim responsibility moments afterwards.
"Things began on social media and they ended on social media," said Gabriel Attal, the French government spokesman. "We have to do better at bringing them under control."
Paty's fellow teachers at the school in Conflans-Sainte-Honorine near Paris expressed "deep concern about the impact of social media" in a joint statement on Tuesday. They called the speed and irreversibility of the messages broadcast "a real plague for the exercise of our profession".
Marlène Schiappa, minister for citizenship, summoned representatives of social media groups, including Facebook-Instagram, Twitter, Google-YouTube, TikTok and Snapchat, to a meeting on "cyber-Islamism" on Tuesday and demanded they take responsibility for content on their platforms.
Big social media platforms such as Facebook and Twitter were already under pressure in Europe, the US and Asia to curb the spread of fake news and hate speech and to stop turning a blind eye to the promotion of violence.
Last year, the platforms pledged to boost their moderation capabilities and introduced new hate speech policies, after a white supremacist killed 51 people in an attack on two mosques in New Zealand and livestreamed the footage via Facebook's Live service.
More broadly, Facebook and YouTube have been criticised for helping extremist groups recruit, radicalise and organise — because their algorithms tend to push users towards provocative and eye-catching content.
Lately, concerns have centred on the rise on the platforms of armed militias in the US ahead of the presidential election, and of pro-Trump conspiracy group QAnon.
Officials and politicians say the killing of Paty will inevitably accelerate legislation in France and the EU designed to hold social media platforms responsible for the sometimes inflammatory content posted by their users.
Investigators are still trying to piece together the sequence of events that led to Abdoullakh Anzorov, a Chechen refugee, hacking off the head of a teacher who had shown pupils caricatures of the Prophet Mohammed in a class about freedom of speech.
But the fact that the murderer, who was shot dead by the police, came all the way from Evreux 80km to the west suggests that he learnt of Paty and Muslim complaints about him from videos posted on the internet. The videos were widely disseminated, with some pupils and parents at the school complaining they had received them multiple times.
Brahim Chnina, the father of one of the pupils in the school, had posted three videos highly critical of Paty, demanding he be fired and calling on people to take action, and BFMTV reported that he had been in touch with the killer via WhatsApp in the days before the assassination.
At least one of the videos could still be seen on Mr Chnina's Facebook account on Monday evening. Mr Chnina made one of them with the help of Abdelhakim Sefrioui, an Islamist militant already categorised as a security risk by French intelligence. Both men have been detained.
After he had killed Paty, Anzorov posted a tweet addressed to President Emmanuel Macron, "leader of the infidels", with a picture of the severed head lying in the street, and boasted of killing "one of your hell dogs who dared to denigrate Mohammed".
According to the newspaper Le Monde, Anzorov in recent weeks sent 400 tweets from that account, @Ttchetchene_270. The account had been notified in July to Pharos, a government site where the public can report lawbreaking or other concerns about the internet.
Twitter declined to say when it had removed the account — it is no longer visible — and refused to make any other comment on the attack. However, the company has said that Twitter does not tolerate terrorism or terrorism content and that its teams act "proactively on this type of content and are in contact with law enforcement agencies in order to act as quickly as possible".
Facebook did not reply to requests for comment.
French leaders from Mr Macron down immediately announced plans to tighten controls on social media after what Mr Attal called the "public lynching" of Paty over the internet.
Gérald Darmanin, interior minister, said 80 investigations had been started since the attack into those who had sought to justify the murder or said the teacher "had it coming to him".
Ironically, Mr Macron's government had already finalised a law against internet hate in May, but its key clauses — including an obligation on social media networks to delete hateful content within 24 hours on pain of heavy fines, and a requirement for transparency — were struck down in June by the Constitutional Council on free-speech grounds.
Laetitia Avia, the member of parliament who drafted the law, described the killing of Paty as a tragedy which "reminds everyone that social media has been the terrain of dangerous content".
She told the Financial Times on Monday she was continuing to work on the issue both in France and in Brussels, where the European Commission is set to present its new digital services law in December.
One problem, she noted, was that traditional media were counted in French law as publishers, while social media networks were treated as neutral "hosters", even though they were really hybrids because their economic model meant they ranked and placed content to attract readers and viewers. Another problem was that apparently anodyne verbal violence was often a precursor to real violence.
"We must now treat dangerous content as a priority," she said.
- Financial Times