Technology is often hailed as an innovation vehicle, a productivity booster, and an enabler of a higher standard of living for all global citizens. However, the field of finance provides an interesting backdrop for what happens when an industry is pushed to its technological limits in the pursuit of automation and speed.
Since the advent of the telegraph, and all the way until the early 1970s, stock prices were displayed on a ticker tape printed in near real time. The ticker tape (via telegraph technology) was a drastic improvement in the delivery of information, since brokers could obtain stock prices with only a 15-20 minute delay from the original quotation.
Setting the dial to the year 2011, we see supercomputers trading stocks not with humans but with other supercomputers. Forget delays in minutes or seconds; today's supercomputers trade in microseconds and are increasingly "co-located" near stock exchange servers to reduce the roundtrip time for electrons passing through networks. In fact, on most trading floors, human brokers are obsolete, as algorithms are now programmed with decision logic to make financial instrument trades at near light speed.
We've come a long way since the decades of ticker tape, says Andrew Lo, professor at the Massachusetts Institute of Technology (MIT). At a recent conference, Professor Lo noted that while technology has opened markets to the masses (e.g., day-trading platforms) and reduced price spreads, there are also downsides to automation and speed.
First, he says, there is the removal of the human element in decision making. As supercomputers trade with each other at near light speed, there are smaller and smaller windows of latency (between event and action) and therefore fewer opportunities for human intervention to correct the activities of rogue algorithms or accidental "fat finger" trades.
Second, with fiber optic networks spanning ocean floors and supercomputers connecting global investors and markets, we've essentially taken a fragile system based on leverage and made it more complex. Automating and adding speed to an already fragile system generally isn't a recipe for success (witness the May 6, 2010 Flash Crash).
Based on these trends, it's easy to imagine a world where financial networks intensify in complexity, capital zips across the globe even faster, and relationships between market participants grow increasingly interconnected. Where loose correlations once existed between participants and events, markets will soon move in lockstep in a tightly coupled system.
To be sure, the confluence of technology and finance has been a boon to society in many respects. However, as Lo says, there are "unintended consequences" in applying the most advanced and fastest technologies to an already fragile system. Whereas a buffer of time once existed to fix mistakes before or even as they occurred, now we're left to clean up the mess after disaster strikes.
In addition, as markets become more tightly coupled and complex, the butterfly effect grows more pronounced: the smallest and strangest event in a faraway locale can potentially cause a global market meltdown.
Paul Barsch works for Teradata, sponsor of Smart Data Collective.