Nassim Taleb: Several years before the financial crisis descended on us, I put forward the concept of "black swans": large events that are both unexpected and highly consequential. We never see black swans coming, but when they do arrive, they profoundly shape our world: Think of World War I, 9/11, the Internet, the rise of Google.
...[O]ur tools for forecasting and risk measurement cannot begin to capture black swans. Indeed, our faith in these tools makes it more likely that we will continue to take dangerous, uninformed risks.
...If we could not measure the risks of potential blowups, what were we to do? The answer is simple: We should try to create institutions that won't fall apart when we encounter black swans—or that might even gain from these unexpected events.
To deal with black swans, we instead need things that gain from volatility, variability, stress and disorder...
Rule 1: Think of the economy as being more like a cat than a washing machine. We are victims of the post-Enlightenment view that the world functions like a sophisticated machine, to be understood like a textbook engineering problem and run by wonks...
By contrast, natural or organic systems are antifragile: They need some dose of disorder in order to develop. Deprive your bones of stress and they become brittle. This denial of the antifragility of living or complex systems is the costliest mistake that we have made in modern times. Stifling natural fluctuations masks real problems, causing the explosions to be both delayed and more intense when they do take place...
Rule 2: Favor businesses that benefit from their own mistakes, not those whose mistakes percolate into the system. The airline industry is set up in such a way as to make travel safer after every plane crash. A tragedy leads to the thorough examination and elimination of the cause of the problem. The same thing happens in the restaurant industry...Without the high failure rate in the restaurant business, you would be eating Soviet-style cafeteria food for your next meal out...
By contrast, every bank failure weakens the financial system, which in its current form is irremediably fragile: Errors end up becoming large and threatening. A reformed financial system would eliminate this domino effect, allowing no systemic risk from individual failures. A good starting point would be reducing the amount of debt and leverage in the economy and turning to equity financing...
A firm with equity financing, however, can survive drops in income. Consider the abrupt deflation of the technology bubble during 2000. Because technology firms relied on equity rather than debt, their failures didn't ripple out into the wider economy. Indeed, their failures helped to strengthen the technology sector.
Rule 3: Small is beautiful, but it is also efficient...To see how large things can be fragile, consider the difference between an elephant and a mouse: The former breaks a leg at the slightest fall, while the latter is unharmed by a drop several multiples of its height. This explains why we have so many more mice than elephants.
So we need to distribute decisions and projects across as many units as possible, which reinforces the system by spreading errors across a wider range of sources. In fact, I have argued that government decentralization would help to lower public deficits...
Rule 4: Trial and error beats academic knowledge...Things that are antifragile love randomness and uncertainty, which also means—crucially—that they can learn from errors. Tinkering by trial and error has traditionally played a larger role than directed science in Western invention and innovation. Indeed, advances in theoretical science have most often emerged from technological development, which is closely tied to entrepreneurship. Just think of the number of famous college dropouts in the computer industry...
Perhaps because of the success of the Manhattan Project and the space program, we greatly overestimate the influence and importance of researchers and academics in technological advancement. These people write books and papers; tinkerers and engineers don't, and are thus less visible.
Rule 5: Decision makers must have skin in the game. At no time in the history of humankind have more positions of power been assigned to people who don't take personal risks. But the idea of incentive in capitalism demands some comparable form of disincentive. In the business world, the solution is simple: Bonuses that go to managers whose firms subsequently fail should be clawed back, and there should be additional financial penalties for those who hide risks under the rug. This has an excellent precedent in the practices of the ancients. The Romans forced engineers to sleep under a bridge once it was completed...