The secret United States National Security Agency documents leaked by Edward Snowden were ground-breaking and changed a great many things. The realisation that we really need to encrypt every piece of data that goes over the internet, because it will be intercepted, is one example.
In comparison, the Wikileaks-released "Vault 7" cache of Central Intelligence Agency documents, hack-ware and cyber nasties isn't in the same class.
The cache has been called old and mundane, yet another diversion for the Trump administration set up by the Russians through Wikileaks, and much of it isn't even secret (although thank you for the free software licenses, CIA: there were a number of those in the dump).
Maybe so, but there are still some useful things coming out of the Vault 7 release.
One issue that popped up with the CIA data dump is whether it's okay for governments and intelligence agencies to amass knowledge of vulnerabilities and, if so, when (if ever) they should disclose them to the public.
This is a really difficult question, as you have to weigh up where the public good lies. If an intelligence agency finds a supermegamazing zero-day hack that nobody knows about, and uses it to uncover information that saves people or prevents something nasty from happening, that's a good thing, right?
Ditto if such vulnerabilities stop governments from forcing the likes of Apple, Google and Microsoft to install backdoors in their software and devices, a risky practice that security experts warn against.
What if the vulnerabilities are in popular software and devices used in the country that the intelligence agency is trying to protect, though? Let's say the bug is in a critical system, like car engine management or driver-assistance computers, or in the control systems of an electricity grid. That's when it becomes difficult to weigh up what's right.
In the US, the government operates the ominous-sounding Vulnerabilities Equities Process (VEP) to work out that thorny issue.
That debate has become a great deal more complex, however, as it's not just spies who find effective hacks. Others find the same vulnerabilities too. Well-known cyber security specialists Bruce Schneier and Trey Herr estimated that between 15 and 22 per cent of vulnerabilities are "rediscovered" by several people.
If Schneier and Herr's research is right, then, as Snowden said, government hacking has become increasingly risky: a sizeable share of stockpiled vulnerabilities may also be found by others, who could be cyber criminals.
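Schneier and Herr's range turns that abstract risk into simple arithmetic. A minimal sketch, using a purely hypothetical stockpile size, of how many stockpiled flaws are likely already in someone else's hands:

```python
# Back-of-envelope sketch. The stockpile size is a made-up illustration;
# only the 15-22 per cent rediscovery range comes from Schneier and Herr.

def expected_rediscovered(stockpile_size: int, rediscovery_rate: float) -> float:
    """Expected number of stockpiled vulnerabilities also found by others."""
    return stockpile_size * rediscovery_rate

stockpile = 100          # hypothetical number of hoarded vulnerabilities
low, high = 0.15, 0.22   # estimated rediscovery range

print(f"Roughly {expected_rediscovered(stockpile, low):.0f} to "
      f"{expected_rediscovered(stockpile, high):.0f} of {stockpile} "
      "stockpiled flaws may already be known to someone else.")
```

Even at the low end of the range, that's a lot of supposedly secret weapons that aren't secret at all.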
Working out which vulnerability might be found by others as well as government hackers, and which therefore should be made public so it can be fixed, sounds like mission impossible, VEP notwithstanding.
Then there's the question of how necessary it is for governments to "stockpile" hacks.
So many vendors churn out utterly insecure software and hardware, stop supporting it with security patches after a while, or ship fixes of such low quality that they introduce other vulnerabilities.
Even if vendors do release security patches, end-users are very good at not applying them. No fancy "0days" needed: just go through a list of old vulnerabilities for your targets and you'll hack them.
The above may seem geeky and esoteric, but in an era where IT security affects us all, we need to think about how governments should hack, and when they shouldn't.