Do you feel lucky, punk? Or do you just feel overly confident about assessing the odds?
You're probably not as good as you think. But if you are any good, you'll already know how bad you are.
Recognising our inadequacies at assessing risk is the key theme of Risk Intelligence by Dylan Evans - my pick for business book of the year.
It's really a book on behavioural psychology and as such is blissfully free of business jargon. But as a primer on the relatively new academic field of risk management, it is fascinating.
Risk assessment is at the heart of good investment decision-making; it is at the heart of all good decision-making. It is a fundamental skill for business success.
Evans' definition of "risk intelligence" is "the ability to estimate probabilities accurately". He has developed a test (which you can do online at www.projectionpoint.com).
His book is a great piece of popular science writing drawing on examples across modern life including politics, gambling and business.
History suggests that New Zealanders aren't great at assessing risk - particularly, as finance writers aren't shy of pointing out, in the investment sphere. Perhaps it runs deeper than that: look at our drink-driving stats or teen pregnancy rates, or even more socially acceptable risk-taking such as mad adventure sports.
Risk is a very difficult thing to assess accurately. Some people seem to have an innate ability. Our richest people often have the skill at the core of their success.
I once asked Graeme Hart about the risky nature of his highly leveraged business strategies. He told me he had a serious aversion to risk. That's why he invests in low-risk businesses such as packaging and staple foods - some of the least volatile industry sectors.
In his view he's playing a low-risk, predictable game; it just happens to be on an enormous scale.
The debt required to get him to that scale might seem risky to you and me, but he considers it a variable over which he has some control.
Humans generally are not great at assessing risks. Animals do it innately, whereas humans let their theories and biases cloud the logic. Research has proved it.
Physicist Leonard Mlodinow writes of an experiment where a light is set up to randomly flash either red or green.
The subject is rewarded for predicting the next colour in the sequence. Rats and humans were both tested.
Overall, the green light was set to flash 75 per cent of the time. The rats quickly worked this out and started picking green every time - ensuring a 75 per cent success rate and sugary treats.
Humans, embarrassingly, had a success rate of about 60 per cent.
The bigger-brained Homo sapiens spent most of the experiment trying to see patterns in the sequence of lights that don't exist.
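The arithmetic behind the rats' win is worth spelling out. A quick simulation (not from the book - a sketch assuming the experiment's 75/25 split) compares the rats' strategy of always picking the more frequent colour against "probability matching", where the guesser spreads their picks to mirror the frequencies:

```python
import random

random.seed(42)
TRIALS = 10_000
P_GREEN = 0.75  # the light flashes green 75 per cent of the time

lights = ["green" if random.random() < P_GREEN else "red" for _ in range(TRIALS)]

# Rat strategy ("maximising"): always pick the more frequent colour.
rat_hits = sum(1 for light in lights if light == "green")

# Human strategy ("probability matching"): guess green 75 per cent of the
# time and red 25 per cent, chasing patterns that aren't there.
human_hits = sum(
    1 for light in lights
    if light == ("green" if random.random() < P_GREEN else "red")
)

print(f"rat (always green): {rat_hits / TRIALS:.0%}")    # ~75%
print(f"human (matching):   {human_hits / TRIALS:.0%}")  # ~62%
```

Always picking green converges on 75 per cent, while matching converges on 0.75 x 0.75 + 0.25 x 0.25 = 62.5 per cent - close to the roughly 60 per cent the human subjects actually managed.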
Pigeons have also been shown to outperform humans in experiments that involve assessing odds.
The first step in improving your risk assessment is recognising your biases.
I ride a motor scooter to work. On Auckland streets that's probably more dangerous than skydiving into the office. I can ride my bike on the motorway or I can ride the ordinary streets.
After some hair-raising experiences I worked out I was far less likely to have a crash on the motorway, because the traffic moves in one direction and there is no one pulling out from side streets or pulling mad U-turns.
But, of course, if I do crash at 100km/h I am far more likely to die. The stakes are too high on the motorway so despite my bias - an aversion to crashing at all - I'm back braving the side streets.
To highlight the complexities, Evans recalls the words of former US Secretary of Defence Donald Rumsfeld.
In a 2002 speech about Iraq and the risks around weapons of mass destruction, he referred to "known knowns" and "unknown unknowns".
Rumsfeld was lambasted for mangling his English. Putting aside the controversy of his topic, the terminology he used was valid.
Risk can be divided into four categories which relate to our available information and our understanding of its relevance.
We can be aware something is a risk yet lack the relevant data to deal with it - this is a "known unknown".
Or we can possess highly valuable information but be blissfully unaware that it is relevant to our problem - this is an "unknown known".
If we have the information we need and also know it is relevant then we are fortunate to be dealing with a "known known".
Then there are those "unknown unknowns" - the issues where we have no relevant information and wouldn't be aware it was relevant even if we had it.
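The four categories boil down to two yes/no questions: do we have the information, and do we know it is relevant? A small sketch (my own illustration, not from the book) lays out the grid:

```python
# The four risk categories as a 2x2 grid, keyed on whether we hold the
# information and whether we recognise its relevance.
RISK_GRID = {
    # (have_information, know_it_is_relevant): category
    (True, True): "known known",        # data in hand, relevance understood
    (False, True): "known unknown",     # we know exactly what is missing
    (True, False): "unknown known",     # data in hand, relevance unnoticed
    (False, False): "unknown unknown",  # the blind spot
}

def classify(have_information: bool, know_relevance: bool) -> str:
    return RISK_GRID[(have_information, know_relevance)]

print(classify(False, False))  # -> unknown unknown
```

Seen this way, Evans' advice is about moving through the grid: asking questions flips the second answer to "yes", and gathering data flips the first.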
Awareness about the limits of our knowledge gives us an edge, says Evans.
Think deeper about what could go wrong, ask enough questions, and we can turn "unknown unknowns" into "known unknowns".
From there we can start to seek the specific information we need and ultimately deal more safely with risks that are "known knowns".
Finance company investors in New Zealand were up against poor information - badly presented or, in some cases, criminally concealed.
But in many cases there was a serious lack of knowledge about the relevant questions that needed to be asked.
We all need to think more about risk.
We should talk about it in schools from an early age. It goes hand in hand with the maths we learn about probability.
We just have to bring the examples into the real world.
Let's recognise we are bad at risk assessment and in doing so be better at it.
* Risk Intelligence: How to live with uncertainty by Dylan Evans
(Free Press 2012)