Wednesday, November 21, 2012

Nassim Taleb on AntiFragility: 5 Rules Where Society can Benefit from Volatility

At the Wall Street Journal, my favorite iconoclast Black Swan theorist and author Nassim Nicholas Taleb explains his five rules for how society can benefit from randomness, volatility, and antifragility.

Definition of fragility and antifragility:
Fragility is the quality of things that are vulnerable to volatility. Take the coffee cup on your desk: It wants peace and quiet because it incurs more harm than benefit from random events. The opposite of fragile, therefore, isn't robust or sturdy or resilient—things with these qualities are simply difficult to break.

To deal with black swans, we instead need things that gain from volatility, variability, stress and disorder. My (admittedly inelegant) term for this crucial quality is "antifragile." The only existing expression remotely close to the concept of antifragility is what we derivatives traders call "long gamma," to describe financial packages that benefit from market volatility. Crucially, both fragility and antifragility are measurable.
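Mr. Taleb's point that fragility and antifragility are measurable can be seen through Jensen's inequality: a convex ("long gamma") payoff gains, on average, as volatility rises, while a concave payoff loses. The following minimal Python sketch illustrates this with toy quadratic payoffs of my own choosing, not anything from the article:

```python
import random

def average_payoff(payoff, volatility, trials=100_000, seed=42):
    """Monte Carlo estimate of the mean payoff under symmetric random shocks."""
    rng = random.Random(seed)
    return sum(payoff(rng.gauss(0, volatility)) for _ in range(trials)) / trials

convex = lambda x: x * x        # "antifragile": gains accelerate with shock size
concave = lambda x: -(x * x)    # "fragile": losses accelerate with shock size

# Doubling volatility raises the convex payoff's average and lowers the concave one's.
assert average_payoff(convex, 2) > average_payoff(convex, 1)
assert average_payoff(concave, 2) < average_payoff(concave, 1)
```

The shocks themselves are symmetric and average to zero; only the curvature of the exposure determines whether disorder helps or hurts.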

As a practical matter, emphasizing antifragility means that our private and public sectors should be able to thrive and improve in the face of disorder. By grasping the mechanisms of antifragility, we can make better decisions without the illusion of being able to predict the next big thing. We can navigate situations in which the unknown predominates and our understanding is limited.
Mr. Taleb's five rules, accompanied by excerpted elucidations (italics mine):
Rule 1: Think of the economy as being more like a cat than a washing machine.

We are victims of the post-Enlightenment view that the world functions like a sophisticated machine, to be understood like a textbook engineering problem and run by wonks. In other words, like a home appliance, not like the human body. If this were so, our institutions would have no self-healing properties and would need someone to run and micromanage them, to protect their safety, because they cannot survive on their own.

By contrast, natural or organic systems are antifragile: They need some dose of disorder in order to develop. Deprive your bones of stress and they become brittle. This denial of the antifragility of living or complex systems is the costliest mistake that we have made in modern times. Stifling natural fluctuations masks real problems, causing the explosions to be both delayed and more intense when they do take place. As with the flammable material accumulating on the forest floor in the absence of forest fires, problems hide in the absence of stressors, and the resulting cumulative harm can take on tragic proportions…

Rule 2: Favor businesses that benefit from their own mistakes, not those whose mistakes percolate into the system.

Some businesses and political systems respond to stress better than others. The airline industry is set up in such a way as to make travel safer after every plane crash. A tragedy leads to the thorough examination and elimination of the cause of the problem. The same thing happens in the restaurant industry, where the quality of your next meal depends on the failure rate in the business—what kills some makes others stronger. Without the high failure rate in the restaurant business, you would be eating Soviet-style cafeteria food for your next meal out.

These industries are antifragile: The collective enterprise benefits from the fragility of the individual components, so nothing fails in vain…

Rule 3: Small is beautiful, but it is also efficient.

Experts in business and government are always talking about economies of scale. They say that increasing the size of projects and institutions brings cost savings. But the "efficient," when too large, isn't so efficient. Size produces visible benefits but also hidden risks; it increases exposure to the probability of large losses. Projects of $100 million seem rational, but they tend to have much higher percentage overruns than projects of, say, $10 million. Great size in itself, when it exceeds a certain threshold, produces fragility and can eradicate all the gains from economies of scale. To see how large things can be fragile, consider the difference between an elephant and a mouse: The former breaks a leg at the slightest fall, while the latter is unharmed by a drop several multiples of its height. This explains why we have so many more mice than elephants…

Rule 4: Trial and error beats academic knowledge.

Things that are antifragile love randomness and uncertainty, which also means—crucially—that they can learn from errors. Tinkering by trial and error has traditionally played a larger role than directed science in Western invention and innovation. Indeed, advances in theoretical science have most often emerged from technological development, which is closely tied to entrepreneurship. Just think of the number of famous college dropouts in the computer industry.

But I don't mean just any version of trial and error. There is a crucial requirement to achieve antifragility: The potential cost of errors needs to remain small; the potential gain should be large. It is the asymmetry between upside and downside that allows antifragile tinkering to benefit from disorder and uncertainty…
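The asymmetry Mr. Taleb describes can be made concrete with a toy simulation of tinkering: many cheap experiments, each with a capped loss and a rare large payoff. The numbers below (5% hit rate, $1 maximum loss, $100 payoff) are my own illustrative assumptions, not figures from the article:

```python
import random

def tinker(trials, p_success=0.05, max_loss=1.0, payoff=100.0, seed=0):
    """Sum the results of many small bets: each loses at most `max_loss`,
    and rarely (with probability `p_success`) pays `payoff`."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += payoff if rng.random() < p_success else -max_loss
    return total

# Expected value per trial: 0.05 * 100 - 0.95 * 1 = +4.05, so despite a 95%
# failure rate, a large portfolio of cheap experiments comes out well ahead.
assert tinker(100_000) > 0
```

Flip the asymmetry (large potential loss, small gain) and the same failure rate becomes ruinous, which is why the cost of errors, not the error rate, is what matters.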

Rule 5: Decision makers must have skin in the game.

At no time in the history of humankind have more positions of power been assigned to people who don't take personal risks. But the idea of incentive in capitalism demands some comparable form of disincentive. In the business world, the solution is simple: Bonuses that go to managers whose firms subsequently fail should be clawed back, and there should be additional financial penalties for those who hide risks under the rug. This has an excellent precedent in the practices of the ancients. The Romans forced engineers to sleep under a bridge once it was completed…
Read the rest here

In complex systems or environments, it is a mistake to see the world as operating mechanically, like a "textbook engineering problem"; any presumption of knowledge applied through social policies only leads to greater volatility and risk.

Put differently, social policies that are averse to change, or designed to eliminate change, lead to unintended consequences. Economist David Friedman's take on the mistake of adhering to the change-averse "precautionary principle" rhymes with Mr. Taleb's antifragile concepts.

Also, centralization only concentrates systemic risks and volatility, whereas decentralization not only distributes and reduces the impact of volatility but also encourages innovation and thus progress.

Bottom line: Randomness, volatility, and antifragility are part of human life. Society would benefit more by learning from them and adapting to them. Decentralized institutions are better suited to harness antifragility. Presuming away the reality of change, as populist politics tends to do, only defeats the "feel good" and "noble" intentions of such social policies.
