"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populations, warlords wishing to perpetrate ethnic cleansing.
"Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity."
Well, no, it wouldn't be beneficial for humanity. Few arms races are. But are autonomous weapons really "the key question for humanity today"? Probably not.
We have a few other things on our plate that feel a lot more "key", like climate change, nine civil wars in the Muslim parts of the world (Afghanistan, Iraq, Syria, southeastern Turkey, Yemen, Libya, Somalia, Sudan and northeastern Nigeria) - and, of course, nuclear weapons.
The scientists and experts who signed the open letter were quite right to demand an international agreement banning further work on autonomous weapons, because we don't really need yet another high-tech way to kill people. They might even succeed, although it will be a lot harder than banning blinding laser weapons or cluster bombs.
But autonomous weapons of the sort currently under development are not going to change the world drastically. They are not "the third revolution in warfare, after gunpowder and nuclear arms", as one military pundit breathlessly described them. They are just another nasty weapons system.
As for weapons that kill people without a human being choosing the victims, those we have in abundance already. From land mines to nuclear-tipped missiles, there are all sorts of weapons that kill people without discrimination in the arsenals of the world's armed forces. We also have a wide variety of weapons for killing specific individuals chosen by a human being - such as guns.
Combine autonomous weapons with true AI, and you get the Terminator. Without that level of AI, all you get is another way of killing people that may, in certain circumstances, be more efficient than having another human being do the job. It's not pretty, but it's not new either.
The thing about autonomous weapons that really appeals to the major military powers is that, like the current generation of remote-piloted drones, they can be used with impunity in poor countries. Moreover, like drones, they don't put the lives of rich-country soldiers at risk. That's a really good reason to oppose them - and if poor countries realise what they are in for, a good opportunity to organise a strong diplomatic coalition that works to ban them.
Gwynne Dyer is an independent journalist whose articles are published in 45 countries