Algorithmic trading is growing fast, and that raises some concerns. What is your analysis?
On the London Stock Exchange, algorithmic trading now accounts for 60 to 70% of the orders sent every day. This is indeed a strong increase, but it is an arithmetic progression. Moreover, the figure that matters is trading volume: the increase mainly concerns the number of orders sent to the market, and out of every 20 orders, 19 are never executed. Nor should we forget that these orders tend to be small; large blocks are rare. But yes, broadly speaking, much of what is traded on the stock market today goes through algorithms.
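As a back-of-the-envelope illustration of why order counts and traded volume diverge, here is a minimal sketch in Python; the ratio simply restates the figures above, and the order count and order size are assumed values for illustration, not market data.

# Of every 20 orders sent to the market, 19 are never executed.
orders_sent = 1_000_000        # hypothetical daily order count (assumed)
fill_ratio = 1 / 20            # 1 order in 20 actually trades
avg_order_size = 200           # shares per executed order (assumed; small, rarely blocks)

executed = orders_sent * fill_ratio
volume = executed * avg_order_size
print(f"{executed:,.0f} executions, {volume:,.0f} shares traded")
# A surge in the number of orders therefore says little about how much
# stock actually changes hands.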
Observers believe that this can only encourage fraud...
I am not especially worried about the risk of fraud, for one simple reason: to run algorithmic trading on a large scale, and thus mount a serious fraud, you need substantial computing resources, resources that only large banks have. And large banks would be the last to commit such a fraud. The idea of Credit Suisse, for example, manipulating markets to benefit its own traders is a myth; it does not happen, no one does that.
Where do these fears come from, if they are unfounded?
Let’s say there is a whole context, with independent brokers whom it suits to claim that banks act in a biased manner and will not necessarily serve their clients with all the necessary integrity. But in reality, it does not happen like that. During the ten years I worked in London, I dealt with almost every bank in the City, some 50 or 60 financial institutions, and in practice I can say that professionals do not cheat; at least I have never seen it.
The other concern tied to the success of automated trading is that of a computer bug, like those the London Stock Exchange has experienced in recent years...
Algorithmic trading is nothing new; it already existed a decade ago. But it is true that its share has steadily increased over the past ten years, and that incidents have multiplied over the past two. This is due to two factors. The first is the strong competition between exchanges, which did not exist before: the London Stock Exchange and Euronext have had to contend with alternative operators such as Turquoise, Chi-X or OMX. The second is the "arms race" to reduce latency. With algorithms, liquidity is less and less apparent; we see only the surface of the markets, and everything comes down to speed when information is released on one or several exchanges simultaneously.
How have exchange platforms reacted?
To stay competitive, they had to change their infrastructure, allowing trading firms to be physically hosted on their premises and removing all the layers - network, hardware and sometimes software - that stood between members and the central matching engine. Those layers made it impossible for a single member to crash an exchange. It was like a funnel: only a limited flow could come out at the end.
But the narrow end of this funnel, which held back exchange performance, was removed so that exchanges could keep up with the competition. The real risks of algorithmic trading, then, have less to do with fraud or bugs. As often happens, the market has moved faster than the regulator, and platforms are caught between two imperatives: promoting access to their matching engine by inviting in as many traders as possible, and being able to absorb all the incoming orders. The solution is to build more robust funnel systems, with cut-outs that trip on excessive volumes. But this is tricky: the LSE may impose its own limits, but the risk is then to see volume migrate to other platforms.
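To make the mechanism concrete, here is a minimal sketch in Python of such a funnel: a token-bucket throttle on a member's order flow, combined with a cut-out that trips when volume becomes excessive. The class, parameter names and thresholds are illustrative assumptions, not any exchange's actual gateway logic.

from collections import deque
import time

class OrderThrottle:
    """Token-bucket funnel with a cut-out (hypothetical sketch)."""

    def __init__(self, rate_per_sec, burst, window_sec, cutout_limit):
        self.rate = rate_per_sec          # sustained orders/second through the funnel
        self.burst = burst                # short-term burst the funnel tolerates
        self.tokens = float(burst)        # current capacity; starts full
        self.last = time.monotonic()
        self.window_sec = window_sec      # look-back window for the cut-out
        self.cutout_limit = cutout_limit  # max orders in that window before tripping
        self.recent = deque()             # timestamps of accepted orders
        self.tripped = False

    def allow(self):
        now = time.monotonic()
        if self.tripped:
            return False                  # member already cut off
        # Refill tokens at the sustained rate: the narrow end of the funnel.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        # Cut-out: trip if the member exceeds the per-window limit.
        while self.recent and now - self.recent[0] > self.window_sec:
            self.recent.popleft()
        if len(self.recent) >= self.cutout_limit:
            self.tripped = True
            return False
        if self.tokens < 1.0:
            return False                  # funnel saturated: reject the order
        self.tokens -= 1.0
        self.recent.append(now)
        return True

The dilemma described above shows up directly in the parameters: raising rate_per_sec and burst attracts more flow to the platform, but leaves the cut-out as the matching engine's last line of defence.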