"The stock market today is a war zone, where algobots fight each other over pennies, millions of times a second...inevitably, at some point in the future, significant losses will end up being borne by investors with no direct connection to the HFT world, which is so complex that its potential systemic repercussions are literally unknowable." Felix Salmon
I've written about algorithms before. I think it's inevitable that the trading of media space will become ever more automated. Price customisation software will play an ever bigger role in the optimisation of pricing. And I think algorithms are fundamental to the future of content. But what about when they go wrong?
A fortnight ago, a computerised trading programme belonging to leading US stock broker Knight Capital Group 'ran amok', with staffers at Knight unable to stop it trading for more than half an hour. The result was a near-fatal $440 million loss; the company was kept alive only by emergency financing, and is now in a position where it is likely to have to sell off parts of its business to keep going.
On May 6th 2010, in the so-called Flash Crash, algorithmic trading contributed to the second largest point swing and the biggest one-day point decline in the history of the Dow Jones Industrial Average, as it lost about 9% of its value, only to recover those losses within a matter of minutes. A joint report by the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission identified how an unusually large sell-off of E-Mini S&P 500 contracts by a large mutual fund firm had initially exhausted available buyers, and how, off the back of that, high-frequency algorithmic traders had then started aggressively selling, accelerating the effect of the mutual fund's selling. The report portrayed "a market so fragmented and fragile that a single large trade could send stocks into a sudden spiral."
Such algorithmic trading is more common than you might think. As of 2009, High Frequency Trading (HFT) firms accounted for 73% of all US equity trading volume. HFT uses algorithms to make highly complex decisions at lightning speed, before human traders are capable of processing the same information. Automated trades are used on the buy side (by pension funds and mutual funds, for example) to sub-divide large trades to minimise market impact and risk (and in some sense hide what they're doing), and on the sell side (by so-called market makers and hedge funds) to provide liquidity to the market. Many, however, have questioned the value of that liquidity, saying that it "has a rather ghostly quality and tends to vanish when needed most".
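The buy-side sub-division mentioned above can be sketched in a few lines. This is a hypothetical, simplified illustration of the idea (splitting one large parent order into many small child orders so the market doesn't see the full size at once), not the actual algorithm used by any firm; the function name and parameters are my own invention.

```python
def slice_order(total_shares, num_slices):
    """Split a large parent order into near-equal child orders,
    spreading any remainder across the first few slices.
    A real execution algorithm would also randomise sizes and
    timing to avoid being detected and front-run."""
    base, remainder = divmod(total_shares, num_slices)
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

# A 100,000-share order sent as 13 small child orders:
child_orders = slice_order(100_000, 13)
print(child_orders)       # 13 slices of ~7,692 shares each
print(sum(child_orders))  # 100000 — nothing lost in the split
```

Even this toy version shows why such flows are hard to spot: each child order looks like ordinary retail-sized trading, while the aggregate intent stays hidden.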
The animated GIF above shows the amount of high-frequency trading in the stock market from January 2007 to January 2012. It shows not only the rise in HFT over that time but a world that, as Felix Salmon of Reuters noted, "in aggregate seemingly has a mind of its own when it comes to trading patterns". The stock market, says Salmon, is clearly more dangerous than it was in 2007, incorporating much greater tail risk, and yet in return for facing that danger, "society as a whole has received precious little utility".
Automation and algorithms are changing the structure of our markets. That much is perhaps inevitable. But is it right that in the quest for speed and frequency we are building the kind of systemic risk whose scale may be unknown but which could well impact far outside the domain of the financial markets? Personally, I think not.
HT to @BBHLabs for the Felix Salmon link