Data Is Useless Without Context
We face danger whenever information growth outpaces our understanding of how to process it. The last forty years of human history imply that it can still take a long time to translate information into useful knowledge, and that if we are not careful, we may take a step back in the meantime.
The most calamitous failures of prediction usually have a lot in common. We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.
In science, progress is possible. In fact, if one believes in Bayes’ theorem, scientific progress is inevitable as predictions are made and as beliefs are tested and refined.
Under Bayes’ theorem, no theory is perfect. Rather, it is a work in progress, always subject to further refinement and testing.
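The refinement process Bayes’ theorem describes can be made concrete with a small sketch. The likelihood values below (0.8 and 0.3) are hypothetical, chosen only for illustration; they are not from the text:

```python
def bayes_update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability a theory is true after observing a
    successful prediction, via Bayes' theorem."""
    numer = p_obs_if_true * prior
    return numer / (numer + p_obs_if_false * (1 - prior))

# Start agnostic; each confirmed prediction is more likely under the
# theory (0.8) than under its negation (0.3) -- assumed values.
belief = 0.5
for _ in range(3):
    belief = bayes_update(belief, 0.8, 0.3)
print(f"belief after 3 confirmations: {belief:.3f}")
```

No single update makes the belief certain; the theory remains a work in progress, but repeated successful tests push the posterior steadily toward 1.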
A forecaster should almost never ignore data, especially when she is studying rare events like recessions or presidential elections, about which there isn’t very much data to begin with. Ignoring data is often a tip-off that the forecaster is overconfident, or is overfitting her model – that she is interested in showing off rather than trying to be accurate.
Absolutely nothing useful is realized when one person who holds that there is a 0 percent probability of something argues against another person who holds that the probability is 100 percent.
Who needs theory when you have so much information? But this is categorically the wrong attitude to take toward forecasting, especially in a field like economics where the data is so noisy.
The litmus test for whether you are a competent forecaster is whether more information makes your predictions better.
As the statistician George Box put it, “All models are wrong, but some models are useful.” What he meant by that is that all models are simplifications of the universe, as they must necessarily be.
As John Maynard Keynes said, “The market can stay irrational longer than you can stay solvent.”
Successful gamblers, instead, think of the future as speckles of probability, flickering upward and downward like a stock market ticker to every new jolt of information.
Laplace’s Demon
For Popper, a hypothesis was not scientific unless it was falsifiable – meaning that it could be tested in the real world by means of a prediction.
How can we apply our judgment to the data – without succumbing to our biases?
Human beings have an extraordinary capacity to ignore risks that threaten their livelihood, as though this will make them go away.
There is a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered seriously looks strange; what looks strange is thought improbable; what is improbable need not be considered seriously.
Meanwhile, if the quantity of information is increasing by 2.5 quintillion bytes per day, the amount of useful information almost certainly isn’t. Most of it is just noise, and the noise is increasing faster than the signal. There are so many hypotheses to test, so many data sets to mine – but a relatively constant amount of objective truth.
After adjusting for inflation, a $10,000 investment made in a home in 1896 would be worth just $10,600 in 1996.
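The figures above imply a strikingly low real return. A quick check, assuming simple annual compounding over the 100 years from 1896 to 1996:

```python
# Real (inflation-adjusted) growth of a $10,000 home investment,
# 1896-1996, using the figures quoted above.
start, end, years = 10_000, 10_600, 100

# Annualized real return: solve start * (1 + r) ** years == end for r.
r = (end / start) ** (1 / years) - 1
print(f"annualized real return: {r:.4%}")  # roughly 0.06% per year
```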
The most basic tenet of chaos theory is that a small change in initial conditions – a butterfly flapping its wings in Brazil – can produce a large and unexpected divergence in outcomes – a tornado in Texas.
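This sensitivity to initial conditions is easy to demonstrate with the logistic map, a standard toy model of chaos (not from the text; the parameter r = 4.0 puts the map in its chaotic regime):

```python
def trajectory(x, r=4.0, steps=60):
    """Iterate the logistic map x -> r * x * (1 - x), recording each state."""
    xs = []
    for _ in range(steps):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

# Two starting points differing by one part in a billion.
a = trajectory(0.2)
b = trajectory(0.2 + 1e-9)

max_gap = max(abs(u - v) for u, v in zip(a, b))
print(f"initial gap: {abs(a[0] - b[0]):.2e}, max gap: {max_gap:.3f}")
```

The tiny initial discrepancy roughly doubles with each iteration, so within a few dozen steps the two trajectories bear no resemblance to each other: the butterfly's flap, in miniature.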