Maybe they could channel all posts through a worldwide pool of moderators, with an algorithm ensuring the sender and moderator were not in each other's social networks. The companies could afford to pay a million moderators a modest amount for each quick assessment. All blocked material could be further checked and unreasonable moderators dropped from the pool. The companies would soon have a resource of sound judgment, and some material they should refer to the police.
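To make the idea concrete, here is a minimal sketch of how such an assignment might work, assuming a simple social graph and an arbitrary threshold for dropping moderators; the names and numbers are illustrative, not any platform's actual system.

```python
import random

class ModeratorPool:
    """Hypothetical pool that routes posts to moderators outside the sender's network."""

    def __init__(self, moderators, max_overturned=3):
        self.moderators = set(moderators)
        self.overturned = {m: 0 for m in moderators}  # count of blocks later judged unsound
        self.max_overturned = max_overturned          # assumed threshold, chosen arbitrarily

    def assign(self, sender, social_network):
        # social_network maps each user to the set of accounts they are connected to.
        connections = social_network.get(sender, set())
        eligible = [m for m in self.moderators
                    if m not in connections
                    and sender not in social_network.get(m, set())]
        return random.choice(eligible) if eligible else None

    def record_overturned(self, moderator):
        # A blocked post later found reasonable counts against the moderator;
        # repeat offenders are dropped from the pool.
        self.overturned[moderator] += 1
        if self.overturned[moderator] >= self.max_overturned:
            self.moderators.discard(moderator)

# Toy usage: a post from "alice" is never reviewed by her connection "m1".
pool = ModeratorPool(["m1", "m2", "m3"])
graph = {"alice": {"m1"}, "m2": set(), "m3": set()}
print(pool.assign("alice", graph))  # picks m2 or m3
```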
Something such as that could be done. But nothing is likely to be done until it is in the companies' financial interest to do it. The advertisers' boycott of media that carried the Christchurch murderer's footage is the sort of pain commerce understands.
The boycott needs to be sustained until the social media hosts devise an effective filter of some sort. Until then, advertisers on Facebook or Google's YouTube run the risk of their brands appearing alongside some similar, even copycat, outrage.
Deprived of an audience for his carnage, would the cowardly killer in Christchurch have gone into a mosque to shoot people in the back? Possibly; not many mass murderers have felt the need to film their moment. But this one clearly did, and it is possible an internet filter would have discouraged him.
Even if it had not discouraged him, it would at least have reduced the risk that equally sick minds around the world would be excited by seeing the footage and plan their own performance.
Social media are doing a great deal of good as well as evil. The large attendance at vigils planned at short notice for the victims in Christchurch attests to the power and reach of intersecting personal networks.
But those who design and maintain these platforms need to realise they are media with public responsibilities. Better they regulate themselves than let the clumsy hand of government decide what everybody may say.