Predicting the future is difficult—just ask George Soros. While Soros is often celebrated for the $1 billion profit he made in 1992 on a bet that the pound sterling would collapse in value, other trades ended up costing him almost as much money as he made.
As detailed in Sebastian Mallaby’s “More Money Than God”, leading up to October 19, 1987, Soros’ Quantum fund had been up 60% for the year. However, when “Black Monday” hit and the Dow Jones lost 22.6% of its value, Soros was in the middle of the mess, deciding whether to sell or buy. He held his positions through Wednesday of that week, but on Thursday he abruptly changed his mind and sold positions worth $1 billion. Soros’ decision to unload his massive portfolio spurred other traders to sell stocks and bonds as well, causing a downward spiral in markets. At the end of the day, Soros was out of the market; however, his Quantum fund had lost $840 million!
That’s the problem with gut decision making, you say. Soros should have used quantitative analysis, right? Unfortunately, even quantitative analysis can produce the wrong outcome.
Editor’s note: Paul Barsch is an employee of Teradata. Teradata is a sponsor of The Smart Data Collective.
According to Roger Lowenstein’s “When Genius Failed”, the hedge fund Long-Term Capital Management (LTCM) was chock-full of the best minds in finance. Assembling PhDs in finance, mathematics, economics and more, LTCM’s partners built sophisticated trading models based on the assumption that while investors sometimes panic or get too optimistic, markets eventually settle toward equilibrium. And in those moments of panic or excessive optimism, LTCM’s partners believed there was money to be made.
Unfortunately, LTCM is a case study in over-reliance on analytical models for decision making. Lowenstein writes, “LTCM Partners believed that all else being equal, the future would look like the past”, and this—of course—turned out to be a calamitous assumption. LTCM bet heavily on its models, often doubling down on investments it believed had an infinitesimal probability of failing. The assumption underpinning these models was that markets are efficient and rational. And when markets proved otherwise, Lowenstein notes, “The fund with the highest IQs lost 77% of its capital, while the ordinary stock investor doubled his money during the same period.”
It is apparent from studying Soros and LTCM that even the most experienced minds, supplemented by analytical tools and techniques, can make extremely poor decisions about the future. So why is predicting the future so difficult?
In Scientific American, author Michael Shermer has an answer. He says that the world is a “messy, complex and contingent place with countless intervening variables and confounding factors which our brains are not equipped to evaluate.” He suggests we stick to short-term predictions rather than the longer-term trends we so often get wrong.
Does this mean that any attempt to predict the future is for naught? Of course not: there are proven, if limited, applications for prediction models in preventing fraud, recommending products, anticipating customer defections, and more. Even three-day weather forecasts are more right than wrong!
The real lesson is that predicting the future is hard, especially when we’re confronted with millions of potential variables. Deciding which variables to pay attention to, and how heavily to weight each one, is especially difficult.
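To make the variable-selection problem concrete, here is a toy sketch (entirely illustrative—the customer attributes, weights, and the churn scenario are invented for this example, not drawn from any real model) showing how two analysts scoring the same customer can reach opposite conclusions simply by choosing and weighting variables differently:

```python
# Toy illustration: "predicting" customer defection (churn) with a
# hand-picked weighted score. All variables and weights are invented;
# the point is that the choice of variables drives the outcome.

def churn_score(customer, weights):
    """Weighted sum over only the variables an analyst chose to include."""
    return sum(w * customer.get(name, 0.0) for name, w in weights.items())

# One hypothetical customer, described by three variables.
customer = {"months_inactive": 4, "support_tickets": 2, "tenure_years": 6}

# Analyst A ignores tenure; Analyst B treats tenure as a loyalty signal.
analyst_a = {"months_inactive": 0.5, "support_tickets": 0.3}
analyst_b = {"months_inactive": 0.2, "tenure_years": -0.4}

score_a = churn_score(customer, analyst_a)  # 4*0.5 + 2*0.3 = 2.6  (likely churn)
score_b = churn_score(customer, analyst_b)  # 4*0.2 + 6*-0.4 = -1.6 (likely stays)
```

Same customer, same data, opposite predictions: the selection and weighting of variables, not the arithmetic, does the real work—which is precisely where human judgment (and error) creeps back in.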
Perhaps, then, the solution is not to spend countless hours predicting the future (especially for strategic decisions), but instead to expend that energy preparing for it.
Questions:
- Do you agree with Michael Shermer that our brains are not equipped to evaluate our “messy world”?
- What other case studies have you encountered where either gut or analytical decision making was taken to the extreme?