The Bank of England must deepen its understanding of the risks of increasingly fast algorithmic trading, and market participants should make sure that they have proper safeguards in place to cope with the speed at which these markets are capable of moving, according to Chris Salmon, the Bank's executive director for markets.
Algorithmic trading relies on a set of instructions for computer systems to follow, but has been blamed for causing volatile swings in the market. These "flash crashes" include the 2010 U.S. equities crash and the sudden drop in the pound against the dollar in October 2016, and it is essential that the Bank keeps pace with ways in which the high-frequency trading market is evolving, Salmon said during an Oct. 6 speech at the 13th Annual Central Bank Conference on the Microstructure of Financial Markets in London.
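To make the idea concrete, here is a deliberately simplified sketch of what such a "set of instructions" can look like. It is a toy moving-average crossover rule written for illustration only, not a description of any real trading system mentioned in the speech: it tells the computer to buy when the short-term average price rises above the long-term average, sell when it falls below, and otherwise do nothing.

```python
# Toy illustration (hypothetical): a trading algorithm is a fixed set
# of instructions the computer follows without human intervention.
# This one compares a short-term and a long-term moving average.

def moving_average(prices, window):
    """Mean of the last `window` prices."""
    recent = prices[-window:]
    return sum(recent) / len(recent)

def signal(prices, short_window=3, long_window=6):
    """Return 'buy', 'sell', or 'hold' for the latest price history."""
    if len(prices) < long_window:
        return "hold"  # not enough history to decide
    short = moving_average(prices, short_window)
    long_ = moving_average(prices, long_window)
    if short > long_:
        return "buy"   # recent prices trending above the longer average
    if short < long_:
        return "sell"  # recent prices trending below the longer average
    return "hold"

# Rising prices: the short-term average overtakes the long-term one.
print(signal([100, 100, 100, 101, 102, 103]))  # buy
# Falling prices: the short-term average drops below the long-term one.
print(signal([103, 102, 101, 100, 99, 98]))    # sell
```

Real high-frequency systems are vastly more sophisticated and react in microseconds, which is precisely why, as Salmon notes, errors or unexpected interactions between such rules can move markets faster than humans can respond.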
"Of course, sharp movements in asset prices are nothing new in themselves. Just look at the tulip mania ending in 1637, and the equity crashes of 1929, 1987, and the bursting of the dot-com bubble in 2000," Salmon said. "What is new is the speed, and the typical near-total reversal. And the Bank’s responsibilities for monetary and financial stability make it incumbent on us to keep up."
The use of algorithmic trading does not change any of the "fundamental" risks of trading in financial markets, but it does intensify them, he said. It also increases the potential for traders to build up large intraday positions, particularly among banks and nonbank intermediaries that specialize in high-frequency trading.
Although banks and other intermediaries that participate in high-frequency trading should be "expert" in managing the risks of "fast markets," this is not always the case, Salmon said, pointing to market-making and equity trading firm Knight Capital, which lost $460 million because of a manual error in 2012.
"The examples of losses by specialist firms in periods of fast market turbulence cautions against assuming that all end-users will effectively manage the new risks associated with the use of algorithms," he said.
Some 21% of institutional asset managers and 10% of corporates use automated algorithmic trading for foreign exchange, Salmon said, citing recent research from Greenwich Associates.
Improved liquidity, but perhaps not efficiency
There is "much evidence" that automated high-frequency trading has improved liquidity in financial markets, Salmon said, adding, however, that it has not necessarily led to greater efficiency. An increasing number of market participants are trading outside traditional exchanges to protect information that could signal their intentions and leave them at a disadvantage, he observed.
"In equity markets, for example, there has been a shift away from trading on 'lit' exchanges. The market has moved more towards 'off-book' trading, where participants transact bilaterally, seeking not to reveal information on their trading intentions, and to dark venues, such as broker-crossing networks and some multi-lateral trading facilities," he said.
Similarly, FX trading is moving toward trading arrangements with a smaller range of counterparties and less visibility.
Although these behaviors are "rational," over time they can reduce transparency, which can make price discovery harder for everyone in the market, Salmon said. Furthermore, the prevalence of flash crashes shows that although high-frequency trading generally helps to set prices efficiently, this is not always the case.
While there is no "risk flashing red" for the Bank of England around high-frequency trading, it is vital that authorities remain "vigilant," he said.