Volume 42, Number 5
March 21, 2018
By Hugh F. Kelly, Ph.D., CRE
The subject of “event risk” has been gaining prominence in investment planning by institutional portfolio managers.1 Weather catastrophes, earthquakes, terrorism, and other events that do not fit neatly into models of portfolio risk and return have, because of stark experience, become unavoidable topics for those accountable for managing money as investment fiduciaries.2 The subject is not universally treated, though, as a glance at discussions ranging from a crowd-sourcing site3 to a textbook chapter on risk in real estate shows.4 Discussions of ‘black swans in real estate’ now appear on conference agendas in the U.S. and around the world. Nassim Nicholas Taleb’s book, The Black Swan, is subtitled “The Impact of the Highly Improbable” and is part of an extended reflection on uncertainty and the nature of chance and change.5
I have typically assigned The Black Swan as required reading in my graduate courses in Risk and Portfolio Management. This may seem like a risky strategy in its own right, since Taleb disdains Mean-Variance Portfolio Theory – the basic core of institutional asset allocation practice and one of the central mathematical tools taught in such courses. Taleb indicts that theory, used by most of the leading practitioners of economics and finance, as “manufacturing phony premises to generate ‘rigorous’ theories.”6
This is not the only time that I invite the fox into the henhouse, by the way. Also on my required reading list for the course are The Myth of the Rational Market,7 When Genius Failed,8 and the slim but incredibly important essay A Short History of Financial Euphoria.9
All of these are, to my mind, essential reading precisely because they challenge the limits of standard investment theory. They fit in, as a matter of fact, with one of the key principles of my own educational philosophy, namely, “There’s no point in doing research unless you are willing to be surprised.” If you think you already ‘know it all,’ why bother?
The acceptance that we still grasp much of our lived experience ‘through a glass, darkly’ and that our ability to forecast the future is imperfect and imprecise seems fairly obvious upon reflection. It is worth reading books that remind us of this at any time, but never more so than in the serious study of statistical analyses that claim to give us confidence in their results, often up to the 95% or 99% confidence level. Such claims can create degrees of confidence that events in the world will prove unjustified.
Taleb says this overconfidence is folly. Who foresaw the impact of the arrival of the plague bacillus on European society, resulting in the death of 45% to 50% of Europe’s population in the six years from 1347 to 1353? Who could have accurately predicted the terrorism events of September 11, 2001, or natural events like the Indian Ocean tsunami that killed 280,000 people in 2004?
Taleb claims we suffer from a triple opacity:
- An illusion of understanding – we think we know more than we actually do
- A retrospective distortion – we tend to ‘explain’ causality only after the fact
- An overvaluation of information – both quantitatively and categorically
Against this set of intellectual deficiencies, The Black Swan10 promotes an approach of “skeptical empiricism,” one that (like Socrates’)11 begins with constant questioning, resists broad theorizing, and sees unexpected events as the real occasions for learning.
The question for us today is: how do we put The Black Swan to work for us?
Let’s accept that it pushes us toward an insight that is true – even if it is not “the whole truth and nothing but the truth.” After all, the same can be said for our usual ways of monitoring, evaluating, and managing risk: our tools may have limits to their utility, but they do have some proven usefulness and therefore can be said to get at certain functional truths,12 even if not “the whole truth and nothing but the truth.” And, lest you think I’m pushing this discussion into unreasonably philosophic territory, be aware that Taleb is very clear that his challenge is to an entire theory of knowledge, that his opposition is to “epistemic arrogance,” and that his aim is a pragmatic epistemology of risk.
We are not far from the three fundamental questions identified by Immanuel Kant in the Critique of Pure Reason:
- What can I know?
- What ought I to do?
- What may I hope?13
Let’s examine the triplet of opacity at the heart of Taleb’s black swan challenge: the illusion of understanding, the retrospective distortion, and the overvaluation of information, keeping in mind the pragmatic issues for real estate and the deeper human background of how our decisions impact our society.
The Illusion of Understanding
Let’s first take the 9/11 destruction of New York’s World Trade Center as a Black Swan event. Immediately after the horror and the shock of the terrorism and the awful loss of life settled into our awareness, real estate people – along with everyone else – tried to come to grips with what this “new era” would mean, and how the world had changed. Slamming commercial aircraft into office towers to make a political point would seem to be the very definition of “a highly improbable event with massively consequential effects.”
There was quickly a widespread consensus about what 9/11 was going to mean for high-rise office buildings – and the outlook was, in a word, dire.14
- It would be impossible to lease high floors in the tallest towers ever again.
- Firms would scatter their employees across multiple locations so that the organization would survive even if a considerable number of their workers did not.
- Downtown New York, in particular, was finished as an office location – and given the toxic cloud that rose from Ground Zero for months after 9/11, it was probably not going to be very desirable as a place to live, either.
- Big cities and big properties were prime “targets of opportunity”; therefore, central business districts were going to be eclipsed by less-visible suburban locations, and the Edge City would rotate forward as the preferred location for businesses.
That was the common understanding, and it was wholly illusory.
This is not just second-guessing on my part. In 2002 I did a study for the Civic Alliance for Downtown New York that asked about the future highest-and-best use of the site at Ground Zero.15 In that study I tried to avoid speculating, to get past the raw emotions, to keep an open mind, and to stick to facts as much as possible. I thought the best guide to what businesses ultimately would do after 9/11 was what they actually did do when forced to make the tough choices of relocating in the one-to-six months after the tragedy.
Here’s what I found. Firms did not go far afield, did not simply take whatever space happened to be opportunistically available, and did not elect to hedge costs by going to the least expensive locations either. With rare exceptions, displaced companies did not move more than three miles from Ground Zero. Where they did move was to sites with a combination of characteristics: access to the entire regional labor force (that is, locations toward the region’s center), a volume of modern, highly functional office space, and multiple options in transportation modalities. In other words, businesses sought to replicate the attributes that had drawn them to the World Trade Center and its environs in the first place.
My conclusions and recommendations? Lean away from the conventional understanding of Downtown New York’s inescapable demise, and rebuild commercial office buildings as the prime use at a re-developed Ground Zero.
In one sense, this approach fits Taleb’s recommendation of a skeptical, bottom-up empiricism. But in another sense it relies upon standard tools of data analysis that Taleb frequently derides in his attack on prediction. The antidote to the illusion of understanding is deeper understanding. The cure for superficial analysis can be simply better analysis.16 Our understanding of Extremistan17 – the low probability tails of distributions – can be aided by the use of the techniques of Mediocristan18, if those techniques are employed carefully and with due humility.
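The distinction between Extremistan and Mediocristan can be made concrete with a small simulation – a toy illustration of my own, not anything from Taleb’s text. In a thin-tailed world (say, human heights) no single observation moves the total; in a heavy-tailed world (say, wealth under a Pareto distribution) one draw can dominate the aggregate, which is exactly why bell-curve tools mislead there.

```python
import random

random.seed(42)

def max_share(samples):
    """Fraction of the total contributed by the single largest observation."""
    return max(samples) / sum(samples)

# Mediocristan: thin-tailed draws (heights in cm); no one draw dominates.
heights = [random.gauss(170, 10) for _ in range(10_000)]

# Extremistan: heavy-tailed Pareto draws (alpha = 1.1, illustrative);
# a single observation can account for a large share of the whole sum.
wealth = [random.paretovariate(1.1) for _ in range(10_000)]

print(f"Mediocristan, largest single share of total: {max_share(heights):.4%}")
print(f"Extremistan,  largest single share of total: {max_share(wealth):.4%}")
```

The numbers here are arbitrary; the point is the pattern – with 10,000 heights, the tallest person contributes a vanishing share of the sum, while one Pareto draw can swamp everything else, matching Taleb’s definitions of the two provinces.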
One final lesson from 9/11. A significant risk that occurs unexpectedly in the particular instance, but that is identifiable in a more general sense, however rarely it may materialize in one property or another, has a very traditional risk-management response: insurance.19 And so one important and effective response to 9/11 has been the U.S. Terrorism Risk Insurance Program, first enacted and reauthorized under President George W. Bush and reauthorized again in January 2015 by President Barack Obama. Although it is not a full answer to the Illusion of Understanding – that we think we know more than we do – insurance does provide us with a mechanism for pricing and adjusting for our ignorance. It’s a Mediocristan tool useful for dealing with Extremistan Black Swan events.20
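That pricing mechanism can be sketched in a few lines. The numbers below are entirely hypothetical – an actuarially fair premium is simply the expected annual loss plus a loading for expenses, profit, and uncertainty about the probability estimate itself:

```python
def fair_premium(annual_probability, expected_loss, loading=0.3):
    """Expected annual loss, grossed up by a loading factor that
    compensates the insurer for expenses and for its own ignorance
    about the true probability."""
    return annual_probability * expected_loss * (1 + loading)

# Hypothetical tower: a 1-in-2,000 annual chance of a $500M terrorism loss.
premium = fair_premium(1 / 2000, 500_000_000)
print(f"Annual premium: ${premium:,.0f}")  # expected loss $250,000, loaded 30%
```

The loading factor is where humility lives: the wider our uncertainty about the probability, the larger it must be – which is precisely the sense in which insurance “prices our ignorance.”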
The Retrospective Distortion
Now we need to confront Nassim Taleb’s controversial position, stated in the title of Part 2 of The Black Swan, “We Just Can’t Predict,” starting with Chapter 10, “The Scandal of Prediction.” In large measure, Taleb believes this failure arises less because we lack clairvoyance than because we fall into three big errors.
First, we dismiss the role of chance in life, in business, and in the course of history.
Second, we succumb to what he calls “the narrative fallacy”: the tendency to look at events after the fact and rationalize – oversimplistically – cause and effect, so that we can feel comfortable that we grasp how and why things happen. We tell ourselves stories.
And then, in the third error, those stories become the basis for a mathematics that quantifies the narratives into forecasting models – models that extrapolate the presumed cause-and-effect relationships, perhaps partially correct but only partially so, into the future, yielding predictions based upon central tendencies, usually regression equations, predictions that avoid extremes as a deliberate statistical strategy.
This critique is Taleb’s basis for connecting the concept of “improbable” to the concept of “unexpected.” It is extremely germane for real estate investors, who must rely on the Principle of Anticipation in evaluating and pricing individual deals and overall investment strategies.21
The Principle of Anticipation states that present value is the discounted present worth of expected future benefits – projected cash flow and project appreciation – of the assets purchased. Taleb asks us to consider if we aren’t really foolish to presume we can anticipate future performance in any meaningful sense.
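The Principle of Anticipation as stated can be put in numbers. Here is a minimal discounted-cash-flow sketch with illustrative figures of my own choosing – the point being how sensitive the “anticipated” value is to the assumptions fed into it:

```python
def present_value(cash_flows, sale_price, discount_rate):
    """Discounted present worth of projected cash flows plus the
    reversion (sale proceeds) at the end of the holding period."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    pv += sale_price / (1 + discount_rate) ** len(cash_flows)
    return pv

# Five years of $1M net operating income and a $15M exit, both projections.
base = present_value([1_000_000] * 5, 15_000_000, 0.08)      # 8% discount rate

# Taleb's challenge: the answer is only as good as the anticipated inputs.
# Move the discount rate two points and the "present value" shifts materially.
stressed = present_value([1_000_000] * 5, 15_000_000, 0.10)  # 10% discount rate

print(f"Value at 8%:  ${base:,.0f}")
print(f"Value at 10%: ${stressed:,.0f}")
```

A roughly $1.1 million swing in value from a two-point change in one assumed parameter is Taleb’s point made arithmetic: the valuation is a forecast wearing the costume of a fact.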
Let’s use the run-up to the Global Financial Crisis as our case in point. A proposition has been advanced that “you can never tell it’s a bubble until it bursts.” That’s the narrative fallacy at work: we look backwards and say, “Aha! That’s why it happened. If we’d only known at the time!!!” Alan Greenspan, in an apologia published in Foreign Affairs magazine, confesses “I never saw it coming,” but then protects his flank by remarking that the Federal Reserve’s extremely sophisticated models didn’t predict the systemic threat – and neither did the models of big banks like JPMorgan Chase, nor those maintained by the World Bank and the IMF. But rather than throw out the models – which is Taleb’s recommendation – Greenspan advocates tweaking them, incorporating elements of behavioral economics to account for what Keynes long ago called “animal spirits”:22 the entry of fear bringing risk aversion into dominance; the claims of immediacy, short-term thinking over long-range perspective; and the herd instinct that first drives speculation and then panic, first irrationally inflating prices and then over-correcting that inflation on the downside. Yet even with a nod toward acknowledging “tail risk” (a key Black Swan argument), Greenspan concludes, “Forecasting will always be something of a coin toss.”23
That’s discouraging. Can’t we do a better job of anticipating? Shouldn’t we do a better job of anticipating? We can, and we should!
There are a number of fine books treating the kind of Black Swan that appears as an economic bubble, including a wonderful essay written in 1993 by John Kenneth Galbraith entitled A Short History of Financial Euphoria. The examples are familiar:
- Tulipmania in the 17th century;
- the Banque Royale and South Sea Company bubbles of the 18th century;
- the booms and busts of banking, canal and railroad stocks in the 19th century;
- and, in the 20th century, the leveraged buyout craze, junk bonds, the real estate syndication mania, the dissolution of the savings and loan industry, the dot-com bubble and bust.
Galbraith quotes Sir Isaac Newton, who said, “I can measure the motion of heavenly bodies, but I cannot measure human folly.”24
As far as measurement goes, who knows what yardstick to use? Here’s a suggestion: the repeated pattern of bubbles reveals a set of recurrent symptoms that can help us diagnose future bubbles in time to act – not just by the retrospective rationalizations of the Narrative.
A few of those symptoms seem to be:
- The proclamation of a “new economy” with new rules, usually with new technology driving the change
- The onset of speculative fever, especially manifested by the dominance of expected future appreciation in setting market prices, as opposed to current operating profits and tangible utility in producing profitable goods and services
- Over-confidence, bred by a specious association of wealth with intelligence
- The rise of an investor generation lacking financial memory
- And the rise of leverage, excessive borrowing against weak collateral and the pursuit of one’s own financial gain by the use of “other people’s money.”
No matter what the models may suggest, the confluence of these factors – the emergence of this complex of symptoms – is how the mental disease of euphoria that inflates financial bubbles manifests itself in advance of the collapse.
Taleb cautions us that we can’t calculate our way toward the anticipation of a Black Swan. Galbraith, years earlier, showed another way – a diagnostic set of conditions – by which bubbles that may defy mathematical prediction can be practically anticipated.
I advise students to tape that list of symptoms to the side of their computers, and to consult the list before turning the output of economic projection models into business recommendations.
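That taped-up list can even be made literal. Here is a trivial sketch of my own construction (not Galbraith’s, and the alarm threshold is arbitrary) that treats the symptoms as a yes/no diagnostic checklist to run before trusting a model’s output:

```python
# Galbraith-style bubble symptoms, framed as yes/no diagnostic questions.
SYMPTOMS = [
    "A 'new economy' with new rules is being proclaimed",
    "Prices are set by expected appreciation, not current operating profits",
    "Wealth is being treated as proof of intelligence",
    "The dominant investor generation lacks financial memory",
    "Leverage is rising against weak collateral ('other people's money')",
]

def diagnose(answers):
    """Count affirmative answers. The more symptoms present, the more
    skeptically any model-driven forecast should be treated."""
    present = sum(answers)
    return present, present >= 4  # arbitrary alarm threshold

count, alarm = diagnose([True, True, False, True, True])
print(f"{count} of {len(SYMPTOMS)} symptoms present; alarm = {alarm}")
```

The code is deliberately silly; the discipline it encodes is not. The point is to force a qualitative diagnostic step between the regression output and the investment recommendation.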
The Overvaluation of Information
At several spots in The Black Swan, Taleb recounts the events of October 19, 1987. That was the day that stocks dropped 22.6% in value on the New York exchange, and markets around the world followed suit. This is still the market’s largest one-day decline in percentage terms in its long history.
I also remember that day very well, since that evening I was scheduled to deliver the third in a series of talks on “financial tectonics” to a group of New York appraisers and underwriters. The conceptual image of the pressures that build up as the plates of the earth’s crust grind against each other until that energy is released in an earthquake was all too apt as an image for that day’s events.
The Narrative Fallacy certainly came into play. Analysts attributed the crash to a market spooked by the escalation of violence between Iran and the U.S., to a sense of over-valuation sparking arbitrage selling in New York against the options on the futures markets in Chicago, even a shortage of liquidity as the London markets were closed due to the Great Storm of 1987.
But most observers attributed the freefall in prices to computerized program trading, triggered by the parameters of portfolio insurance. The algorithms in the programs incorporated vast amounts of market data very quickly, and were thought to be a measure of protection in signaling reasons to “get out”, to sell, if conditions hit trigger points that had previously been associated with accumulating risk.25
The programs were supposed to use data, numbers representing real information, to mitigate risk. They exacerbated risk instead.
In an unexpectedly powerful demonstration of the Efficient Market Hypothesis, all the major traders had the same information and the same algorithms – meaning that they all sent “sell signals” at the same time. With a huge imbalance of sellers over buyers, there was only one direction for a market seeking a clearing price – down into the abyss. Only the closing bell could halt the cascade – and even then it took hours to book and register the trades that would finally tally the day’s losses.
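The dynamic just described – identical data, identical algorithms, simultaneous sell signals – can be caricatured in a few lines. This is a toy model of my own, not the actual 1987 portfolio-insurance logic; the trigger, impact, and shock parameters are invented for illustration:

```python
def clearing_price(price, traders, trigger_drop=0.05,
                   impact_per_seller=0.03, shock=0.06):
    """Every trader runs the same rule: sell if price falls more than
    `trigger_drop` below the recent peak. A modest shock trips every
    copy of the algorithm at once, and the simultaneous sell orders
    push the clearing price far below what the shock alone would."""
    peak = price
    price *= (1 - shock)  # bad news: a 6% drop, past everyone's 5% trigger
    if price < peak * (1 - trigger_drop):
        # all traders sell into the same falling market at the same time
        price *= (1 - impact_per_seller) ** traders
    return price

print(f"Shock alone:        {100 * (1 - 0.06):.2f}")
print(f"After sell cascade: {clearing_price(100.0, traders=10):.2f}")
```

A 6% piece of bad news becomes a drop of roughly 30% once ten identical algorithms fire together – while a 3% shock, below everyone’s trigger, produces no cascade at all. The instability comes not from any one program but from their uniformity.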
The October 1987 Crash raises a question: at its extreme, is the Efficient Market Hypothesis rational, or is it insane?26
A Black Swan like this has, in fact, been seen several times since 1987.27 Think of the mean-reversion model that ultimately sank Long-Term Capital Management, Wall Street’s largest hedge fund, in 1998. Or the 2010 “Flash Crash,” for which high-frequency trading algorithms bore much of the blame. These are the Black Swan risks of the Big Data era, risks only made more acute by cyber-crime and even cyber-warfare.
It is not that the problem is too much information and too sophisticated math. We need both to do business in the modern world. Rather, the problem is that using algorithms as substitutes for prudent judgment, and valuing speed over careful consideration, is a mistaken approach to thoughtful decision-making.28 Data and analysis are always about the past, but good judgment is of the essence in making decisions shaping the future. A marvelous book entitled Understanding Computers and Cognition29 has a memorable line: “at the level of judgment, the finest computers fail.”
Daniel Kahneman, in Thinking Fast and Slow, describes two distinct ways of thinking in human beings.30 System One is fast and automatic, almost effortless, impressionistic, intuitive, and highly adaptive for ordinary, everyday living – and probably for ordinary, routine business management as well. System Two, on the other hand, is slower and more effortful, requiring concentration and complex integration. Critically, System Two is activated at times when events do not fit neatly into our mental models – our expectations. And, equally critically, System Two is charged with monitoring our own behaviors – the slow and deliberative activity of System Two is where self-reflection enters into our decision-making. It is where we understand not only the limits of our models, but also those of our own capacity.
Dare I say that this is the point where the virtue of humility – an appreciation of our capacity for potential mistakes – enters as a check against risks that are deemed improbable simply because we can’t imagine things going wrong?
Things can go wrong. Things do go wrong.
If we ignore that possibility, well, we’ve invited the Black Swans to fly into our world.
A short time ago (2015), Pope Francis canonized Fra Junípero Serra as a saint of the Catholic Church. As part of the process of considering sainthood, the Vatican appoints a canon lawyer to argue the case against canonization.31 That lawyer is called the Devil’s Advocate. In our own business thinking, and in the collective thinking of boards and investment committees, a permanent Devil’s Advocate could institutionalize System Two thinking. That, I suggest, would be a step toward decreasing reliance on overly automatic and minimally reflective processes, and toward reducing the conditions that breed Black Swans with distressing frequency.
Despite its polemical excesses, Nassim Taleb’s work on uncertainty underscores a key requirement for real estate practitioners and the academics who analyze real estate activity, namely, the inevitable need to make decisions in real time on the basis of admittedly incomplete knowledge. That’s the human condition, hypotheses about full rationality and trust in the market’s invisible hand notwithstanding.
Taleb’s contribution in pointing out our limitations in knowledge is helpful but, in my judgment, incomplete. Careful attention to the assumptions behind our analytics can increase our recognition of risks, even risks that are surprising in their specific appearance but that can be prepared for in a more global sense. We can think more like doctors understanding the emergence of disease symptoms and practice “preventive medicine” as those symptoms come to the fore.
And we can deepen our understanding that right judgment is rarely “an inside job.” Open-mindedness to evidence, a willingness to be surprised by the facts, and (crucially) to accept the insights of others with contrary viewpoints are valuable business attributes to have, important way-finders at the fork in the road between fruitful innovation and foolish improvisation.32
4. jrdelisle.com/jrd_text/1Chapter4_New_V17_L2.pdf ↩
5. The Black Swan (Random House, 2007) is one of a series of books in Taleb’s work on uncertainty and risk, the Incerto series. The others are Fooled by Randomness (2001), The Bed of Procrustes (2010), and Antifragile: Things that Gain from Disorder (2012), all published by Random House/Penguin. ↩
6. See Chapter 15 of The Black Swan, “The Bell Curve, That Great Intellectual Fraud” (pp. 229–252). One of the insightful discussions in this chapter shows the relationship of the much-used statistical measure of “standard deviation” to an assumed bell curve, or normal distribution of data. Well in advance of Taleb’s books, two RREEF researchers, Mike Young and Richard Graff, made the same observation in “Real Estate Is Not Normal: A Fresh Look at Real Estate Return Distributions” (Journal of Real Estate Finance and Economics, 10:3, May 1995, pp. 225–259). I have used a similar critique, and proposed an alternative approach, in “Ex post to ex ante: using some lessons from the global financial crisis to prepare for future risk,” Journal of Property Investment & Finance, Vol. 35, No. 6, September 2017, pp. 1–15. ↩
7. Justin Fox, The Myth of the Rational Market: A History of Risk, Reward, and Delusion on Wall Street. (Harper/Business, 2009). ↩
8. Roger Lowenstein, When Genius Failed: The Rise and Fall of Long-Term Capital Management. (Random House, 2000) ↩
9. John Kenneth Galbraith, A Short History of Financial Euphoria. (Penguin, 1990). ↩
10. References are to this text when “Taleb” is mentioned, and to avoid annoying repetition page citations are passed over – especially since his basic concepts are sprinkled liberally through the book. I acknowledge that there is a blatantly polemical tone to The Black Swan and that Taleb is prone to hyperbole in making his points, some of which I disagree with. But his push-back on the prevailing orthodoxy in academia and on Wall Street is a bracing corrective, based upon his experience as a trader and wide-ranging scholarship, and well worth serious consideration. ↩
11. The Delphic Oracle proclaimed Socrates the “wisest of men,” a claim that Socrates himself set out to disprove (Plato, Apology, 20c–24e). Karl Jaspers identifies the core of Socrates’ activity as the conviction that “Conversation, dialogue, is necessary for the truth itself… the significance of Socrates’ approach is that one must know one’s own ignorance and embark on the journey of thought.” (Jaspers, Socrates, Buddha, Confucius, Jesus: The Paradigmatic Individuals, from The Great Philosophers, Volume 1, p. 6). We might very much take this to heart in the 21st century, when major legislation is enacted without full debate, hearings, and investigation. Jaspers’ doctoral student, Hannah Arendt (one of my own teachers), identifies the process of open deliberation as The Promise of Politics (Schocken, 2005: a collection of Arendt’s papers, posthumously published as edited and introduced by Jerome Kohn). Socrates, Arendt maintains, endeavored “to make the city (polis) more truthful by delivering each of the citizens of their truth… by talking things through” (p. 15). ↩
12. The evolution of the mathematical approach to risk measurement and mitigation is masterfully presented in Peter L. Bernstein, Against the Gods: The Remarkable Story of Risk (John Wiley & Sons, 1996). ↩
13. Immanuel Kant, Critique of Pure Reason (reprinted by Cambridge University Press, 1998). ↩
14. See, for instance, Emerging Trends in Real Estate 2003, pages 10 – 13, “Terror’s Unsettling Legacy.” (PricewaterhouseCoopers/Lend Lease). ↩
15. See Regional Plan Association (2002), “Reports of the Economic Development Working Group,” Chapter IV: “The New York Regional and Downtown Office Market: History and Prospects after 9/11,” by Hugh F. Kelly. See also, http://www.academia.edu/2754377/The_aftermath_of_the_9_11_attack_in_the_New_York_City
16. This is, of course, a very Socratic position and one which is reminiscent of Supreme Court Justice Louis Brandeis’ Whitney v. California opinion (1927), “If there be time to expose through discussion the falsehood and fallacies, to avert the evil by the processes of education, the remedy to be applied is more speech, not enforced silence.” ↩
17. Defined by Taleb (Black Swan, p. 308): “the province where the total can conceivably be impacted by a single observation.” ↩
18. Defined by Taleb (Black Swan, p. 309): “the province dominated by the mediocre, with few extreme successes or failures. No single observation can meaningfully impact the aggregate. The bell curve is grounded in Mediocristan.” ↩
19. See Bernstein, op. cit., especially chapters 5 and 12. ↩
20. In recognition of the same problem that Taleb underscores, Bernstein (p. 202) acknowledges the limitations of our knowledge based on sampling:
“The information you have is not the information you want.
The information you want is not the information you need.
The information you need is not the information you can obtain.
The information you can obtain costs more than you want to pay.”
21. See, for instance, The Appraisal of Real Estate (11th ed.), (Appraisal Institute, 1996), p. 35. ↩
22. In his masterwork, The General Theory of Employment, Interest, and Money, (Macmillan, 1936), pp. 161-162. ↩
23. Alan Greenspan, “Never Saw It Coming: Why the Financial Crisis Took Economists by Surprise,” Foreign Affairs (November/December, 2013), pp. 88 – 96. ↩
24. Attributed to Newton in response to a question about the South Sea Bubble, see “Mammon and the Money Market,” in The Church of England Quarterly Review (1850), p. 142. ↩
25. Explanations have tended to coalesce around the technical issues of program trading in the three decades since the event, but the search for reasons in the immediate aftermath cast the net of blame widely – and tellingly – beyond Wall Street itself, into the realms of geo-politics and even weather. Financial orthodoxy, like most orthodoxies, is very efficiently self-protecting of its assumptions. ↩
26. This debate continues to rage amongst economists. Burton Malkiel, author of A Random Walk Down Wall Street (Norton, 1973), defends the conventional position in “The Efficient Market Hypothesis and its Critics” (Journal of Economic Perspectives, V. 17, N. 1, Winter 2003, pp. 59–82). As long ago as the 1960s, economist Hyman Minsky stressed that markets can be efficient periodically, but can – under the influence of profit-maximizing pressures – then enter into episodes of great inefficiency and even market failure. Galbraith, op. cit., regards such episodes as endemic in the system. At the extreme, we find Imad A. Moosa, Contemporary Issues in the Post-Crisis Regulatory Landscape, Chapter 4: “The Efficient Market Hypothesis as a Weapon of Mass Destruction” (World Scientific Publishing, 2016), pp. 107–128. ↩
27. It is worth noting that Galbraith predicted as much. Op. cit., p. 110. ↩
28. Thus there is a significant danger in the recurrent proposals, by those following John Taylor of the Hoover Institution, that the Federal Reserve adhere to a strict mathematical formula for setting interest rate policy along the lines of the “Taylor Rule.” Advocates for this position think along the lines found in http://www.heritage.org/monetary-policy/report/why-congress-should-institute-rules-based-monetary-policy
Others, including myself, think “the human factor” is a vital input; see, https://mises.org/library/problem-rules-based-monetary-policy ↩
29. Terry Winograd and Fernando Flores, Understanding Computers and Cognition: A New Foundation for Design, (Addison-Wesley, 1987). ↩
30. Daniel Kahneman, Thinking Fast and Slow (Farrar, Straus & Giroux, 2011). See also on this subject, Richard Thaler, Misbehaving: The Making of Behavioral Economics (W.W. Norton & Co., 2015). ↩
31. This designation was unusually controversial, as supporters stressed the Franciscan’s missionary and evangelizing zeal and his personal sacrifices, while opponents decried his role in the service of Spanish imperialism in suppressing the native populations of California, including slavery and inhumane treatment. The Devil’s Advocate in this case had some evidence to work with. ↩
32. Discussed in greater detail in Hugh F. Kelly, “Judgment: Imagination, Creativity, and Delusion,” in Existenz: An International Journal in Philosophy, Religion, Politics, and the Arts, V. 3, N.1 (Spring 2008), pp. 58-69. Downloadable at http://www.bu.edu/paideia/existenz ↩