The high frequency trade-off between speed and sophistication

https://doi.org/10.1016/j.jedc.2020.103912

Abstract

Central to the ability of a high frequency trader to make money is speed. In order to be first to trading opportunities, firms invest in the fastest hardware and the shortest connections between their machines and the markets. Yet this is not enough: algorithms must be short, no more than a few instructions. As a result there is a trade-off in the design of optimal high frequency trading strategies: being the fastest necessitates being less sophisticated. To understand the effect of this tension, a computational model is presented that captures latency, both of code execution and of information transmission. Trading algorithms are modelled through genetic programming, with longer programmes allowing more sophisticated decisions at the cost of slower execution times. It is shown that, depending on the market composition, short fast strategies and slower more sophisticated strategies may both be viable and exploit different trading opportunities. Although the relative profits of these different approaches vary, slow traders benefit and social welfare increases in the presence of HFTs. A suite of regulations is tested to manage the risks associated with high frequency trading; the majority are found to be ineffective, though constraining the ratio of orders to trades may be promising.

Introduction

The phrase ‘time is money’ neatly captures the business model of high frequency traders (HFTs). These traders make money by being first, whether that is first to trade against incoming orders or first to revise stale quotes in the event of changing market conditions. Slower algorithms miss out on the best opportunities and bear more risk. As a result there is an arms race amongst HFT firms to create algorithms that can identify mispricings and execute new orders in the fastest time. The time to execute an order depends on several factors: the time taken for signals to travel between the exchange and the HFT computers, the computer hardware, and the algorithms that run on it. HFT firms pay to minimise the first two factors so as not to be at a disadvantage to their competitors. This includes co-locating their hardware with that of the exchange and purchasing the fastest and most up to date computer hardware. The final component, the algorithms that govern trade, is the source of competitive advantage for HFT firms but also represents a trade-off. Longer trading algorithms take more computational cycles to execute and therefore result in slower actions. By reducing the length of an algorithm the HFT firm makes its order more likely to be first. Shortening algorithms, however, has a consequence: reducing the number of instructions reduces the information processing capacity of the algorithm, and therefore its ability to identify profitable opportunities and avoid losses. This is a fundamental trade-off in the design of HFT algorithms. The shorter the algorithm the more likely it is to be first to act, but the less sophisticated its strategy.
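To make the arithmetic of this trade-off concrete, the following sketch compares the reaction times of a short and a long algorithm when transmission and hardware delays are equal. It is purely illustrative: the latency figures and the linear cost per instruction are assumptions, not measurements from the paper.

```python
# Illustrative reaction-time model: total latency = link delay + processing.
# All numbers are hypothetical; the paper's model defines its own units.

LINK_DELAY_US = 5.0   # assumed one-way transmission time to the exchange (microseconds)
CYCLE_TIME_US = 0.01  # assumed cost of executing one instruction

def reaction_time_us(num_instructions: int) -> float:
    """Time from observing a signal to an order reaching the exchange."""
    return LINK_DELAY_US + num_instructions * CYCLE_TIME_US

fast = reaction_time_us(50)     # short, unsophisticated algorithm
slow = reaction_time_us(5000)   # long, sophisticated algorithm

print(f"fast: {fast:.2f}us, slow: {slow:.2f}us")
# With equal link delays, the 50-instruction algorithm wins the race by
# (5000 - 50) * 0.01 = 49.5 microseconds -- the price of sophistication.
```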

It is this trade-off that is the basis of the investigation in this paper. A model is constructed of the behaviour of HFT algorithms subject to a speed/sophistication trade-off. HFTs endogenously decide when to trade based on the actions of others and information arriving at the market. I analyse the problem faced by HFT strategy designers: what is the optimal strategy when speed may be traded off against information processing ability? Whilst highlighted in the media (O’Brien, 2014) and by professionals (Sapir, 2019), this issue has not previously been considered in the scientific literature. Multiple papers have looked at trading speed in the face of technological costs (see, for instance, Biais et al., 2015 or Delaney, 2018); however, they have done so in an environment of perfect rationality. Similarly, papers such as Huang and Yueshen (2018) and Bernales (2019) examine information acquisition and speed acquisition as separate dimensions. Here I make a new argument that the two are linked: the cost of speed is not just monetary but also cognitive (computational); in order to increase speed, information processing capacity, and therefore perfect rationality, has to be sacrificed. The extent to which traders are willing to do this is not clear. For instance, it is not inevitable that algorithms will become ever simpler and faster. Unsophisticated trading algorithms may leave money on the table that slower and more sophisticated HFTs can identify and capture.

In order to investigate this question it is necessary to have a representation of strategies in which computational sophistication is related to time in a realistic manner. The natural choice is a computational approach in which high frequency traders are, as in real life, algorithms. Longer algorithms, as measured by the number of instructions, generally take more time to execute as the computer processor must step through each instruction in turn. The difficulty of this approach is then specifying the optimal trading algorithm(s). The relationship between code length, algorithm design and performance is complex and intractable. To resolve this I optimise the trading algorithms by competing them against each other within a market. This process maintains those that do well whilst continuously looking for modifications and improvements that will enhance performance. The approach of optimising programmes (as opposed to parameters) is referred to as genetic programming: essentially, evolving algorithms to solve a task. It is particularly appropriate here as the structure of the algorithm itself is subject to modification: it may be made longer or shorter and the instructions within it changed. Other evolutionary techniques, such as genetic algorithms, work by optimising parameters within a specified algorithm, which in this case would omit one of the key areas of potential differentiation between traders.
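As a rough illustration of how such variable-length strategies can be represented, the sketch below encodes a trading rule as a list of primitive instructions whose length determines its execution latency. The instruction set and mutation operators here are invented for illustration; the paper's reference list points to a Cartesian genetic programming library (Turner et al., 2014), so this is not the paper's actual encoding.

```python
import random

# Hypothetical instruction set; the paper's actual primitives differ.
INSTRUCTIONS = ["best_bid", "best_ask", "mid", "spread", "depth", "add", "sub", "if_gt"]

def random_program(length: int) -> list[str]:
    """Draw a random variable-length strategy."""
    return [random.choice(INSTRUCTIONS) for _ in range(length)]

def latency(program: list[str]) -> float:
    # Execution time grows with program length: the core trade-off.
    return len(program) * 0.01

def mutate(program: list[str]) -> list[str]:
    """Point mutation plus length-changing grow/shrink operators."""
    p = program.copy()
    op = random.random()
    if op < 0.5 and p:                          # point mutation: swap one instruction
        p[random.randrange(len(p))] = random.choice(INSTRUCTIONS)
    elif op < 0.75:                             # grow: more sophistication, slower
        p.insert(random.randrange(len(p) + 1), random.choice(INSTRUCTIONS))
    elif len(p) > 1:                            # shrink: less sophistication, faster
        del p[random.randrange(len(p))]
    return p
```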

For this problem genetic programming creates an attractive analogy: a market of trading algorithms competing to make profits based on the speed and sophistication of their chosen algorithm, with the most successful surviving and the loss-makers being replaced. Given sufficient time this process leads to a steady state in which the important details of trading algorithms and market behaviour no longer change. It is this state, rather than the optimisation process which produces it, that I analyse. Genetic programming has been used to simulate trading strategies of differing sophistication before. Yeh (2008) uses a genetic programming model to show that greater intelligence improves market efficiency, while Ladley et al. (2015) use one to investigate the relationship between skill and market fragmentation, showing that large numbers of unskilled individuals make the market more susceptible to shocks. Manahov et al. (2014) show that varying the length of trading strategies affects trader and market performance. Importantly, however, no work has looked at the trade-off between speed and sophistication.
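The competitive dynamic described here can be sketched as a simple evaluate-select-replace loop. This is a minimal caricature: the population size, replacement rule and `simulate_market` function are placeholders standing in for the paper's full market simulation, and `random_program`/`mutate` are the illustrative helpers from the previous sketch.

```python
import random  # assumes random_program and mutate from the earlier sketch

def simulate_market(population):
    """Placeholder: run one market session, return profit per trader."""
    return [random.gauss(0.0, 1.0) for _ in population]  # stand-in profits

population = [random_program(random.randint(5, 50)) for _ in range(100)]

for generation in range(10_000):
    profits = simulate_market(population)
    ranked = sorted(zip(profits, population), key=lambda x: x[0], reverse=True)
    survivors = [prog for _, prog in ranked[:90]]               # keep the top 90%
    offspring = [mutate(random.choice(survivors)) for _ in range(10)]
    population = survivors + offspring
# After many generations, strategy lengths and market statistics settle
# into the steady state that the paper analyses.
```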

Using the model, I show that despite competition to be fastest, and therefore first to arrive at a trading opportunity, not all HFTs take this approach in equilibrium. Whilst increasing competition between HFTs does push the population towards shorter strategies, this is not universal. Some traders adopt longer strategies and are shown to generate greater per order (and per trade) profits than those adopting the fastest algorithms. These traders identify and exploit trading opportunities that their faster competitors miss. As such there are multiple equilibria in the design of HFT trading strategies.

The presence of HFTs within the market is shown to be generally positive. Increasing numbers of HFTs increase social welfare and improve market quality through higher liquidity and lower pricing errors. At the same time they have relatively little effect on the overall profits of slow traders, whose gains instead come through improved prices and reduced waiting costs for trade. The greater competition arising from higher numbers of HFTs reduces profits for HFTs and other liquidity providers, but has no negative effects on other, slower traders.

The effectiveness of a suite of regulations proposed to manage the impact of high frequency traders is considered. The majority, including minimum resting times, speed bumps and increased tick sizes, are found to have little or no positive effect, often damaging market quality and increasing the returns of HFTs at the expense of slower traders. Transaction taxes are found to be particularly detrimental to both the market and traders. The only regulation with a potentially positive effect is a constraint on the ratio of orders to trades submitted by HFTs, sketched below. This regulation increases social welfare and the profits of slow traders relative to HFTs, but at the expense of reduced market liquidity.
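One way to picture the order-to-trade constraint is as a per-trader counter that blocks new submissions once orders outnumber executed trades by some multiple. The sketch below is a hypothetical enforcement rule: the ratio limit and the blocking mechanics are assumptions, not the paper's specification.

```python
class OrderToTradeLimiter:
    """Illustrative order-to-trade ratio cap (the limit value is hypothetical)."""

    def __init__(self, max_ratio: float = 10.0):
        self.max_ratio = max_ratio
        self.orders = 0
        self.trades = 0

    def may_submit(self) -> bool:
        # Grace period before any trades have printed: allow up to max_ratio orders.
        if self.trades == 0:
            return self.orders < self.max_ratio
        return self.orders / self.trades < self.max_ratio

    def on_order(self):
        self.orders += 1

    def on_trade(self):
        self.trades += 1
```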

The remainder of the paper is organised as follows. Section 2 considers previous work looking independently at trading speed and the effect of cognitive ability on market performance, together with HFT regulation. Section 3 presents a model of a market in which traders trade off speed against strategic sophistication. Section 4 presents results on optimal trading strategies and market quality, whilst Section 5 looks at the effectiveness of regulations. Section 6 concludes.


Related literature

The trade-off between speed and computational sophistication has not previously been considered in relation to trading. There have, however, been pieces of work that have looked at each factor individually. The link between intelligence and financial success has been studied extensively. Some of the earliest work looked at the connection between wages and intelligence (see, for instance, Moore, 1911). More recently, papers have considered the link between cognitive ability and stock market performance.

Model

I consider a continuous time model of the trade of a single financial asset. The asset is traded through an order book, defined in the standard manner. It consists of a discrete series of equidistantly spaced prices at which orders may be submitted. The set of prices is $\Pi=\{p_i\}$, whilst the distance between adjacent prices is referred to as the tick size and is equal to $\delta p$. Each price $p_i$ has an associated queue of unfilled orders available at that price at time $t$, potentially of size zero, denoted $l^i_t$.
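A minimal data structure matching this description, a fixed price grid with a FIFO queue per tick, might look as follows. The grid size and tick value are arbitrary choices for illustration, not parameters from the paper.

```python
from collections import deque

TICK_SIZE = 0.01   # delta_p: distance between adjacent prices (arbitrary here)
NUM_PRICES = 1000  # illustrative size of the discrete price grid

# Price index i maps to p_i = i * TICK_SIZE; each price carries a FIFO queue
# of unfilled orders (l_t^i in the text), possibly empty.
book = {i: deque() for i in range(NUM_PRICES)}

def submit(i: int, order_id: str, size: int):
    """Append a limit order to the back of the queue at price index i."""
    book[i].append((order_id, size))

def best_ask(from_index: int = 0):
    """First price index at or above from_index with a non-empty queue."""
    return next((i for i in range(from_index, NUM_PRICES) if book[i]), None)
```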

Results

The results of the model are presented in two parts. The first focuses on the strategies and profitability of HFTs in the market before considering their effect on market quality. The second group of results examines the effect of a suite of regulations on market quality and the behaviour of traders.

Regulation

There has been much debate about the regulation of financial markets in the presence of HFTs. In particular, HFTs have been accused of damaging market quality and inhibiting trading opportunities for other traders. The results above suggest that whilst HFTs may narrow the spread, they have negative effects on available quantities. At the same time there may be over-investment in HFT technologies, damaging social welfare. As a result, a suite of regulations has been proposed by regulators and…

Conclusion

The theoretical analysis of high frequency trading has generally focused on optimal behaviour, and so has ignored the trade-off between sophisticated decision making and time. I present a model of the interaction of high frequency traders in which trading speed is dictated by strategy sophistication and the choice of strategy sophistication is endogenous. Using this model I show that there are multiple equilibria in strategy design. Despite competition to be first by being quickest, traders…

References (45)

  • J. Brogaard et al., Trading fast and slow: colocation and liquidity, Review of Financial Studies (2015)
  • J. Brogaard et al., High-frequency trading and price discovery, Review of Financial Studies (2014)
  • E. Budish et al., The high-frequency trading arms race: frequent batch auctions as a market design response, Quarterly Journal of Economics (2015)
  • S.V. Burks et al., Cognitive skills affect economic preferences, strategic behavior, and job attachment, Proceedings of the National Academy of Sciences (2009)
  • A. Kirilenko et al., The flash crash: high-frequency trading in an electronic market, Journal of Finance (2017)
  • T. Lensberg et al., Costs and benefits of financial regulation: short-selling bans and transaction taxes, Journal of Banking and Finance (2015)
  • V. Manahov et al., The implications of trader cognitive abilities on stock market properties, Intelligent Systems in Accounting, Finance and Management (2014)
  • A. Menkveld et al., Need for speed? Exchange latency and liquidity, Review of Financial Studies (2017)
  • H.L. Moore, Laws of Wages: An Essay in Statistical Economics (1911)
  • J. Paulin et al., Understanding flash crash contagion and systemic risk: a micro-macro agent-based approach, Journal of Economic Dynamics and Control (2019)
  • N. Sapir, High-Frequency Trading and Ultra Low Latency Development Techniques, Technical Report (2019)
  • A.J. Turner et al., Introducing a cross platform open source cartesian genetic programming library, Genetic Programming and Evolvable Machines (2014)

Acknowledgements

I thank the British Academy and Leverhulme Trust for their support through the Small Grant scheme. In addition I thank the editor, two anonymous referees and seminar participants at Computing in Economics and Finance, 2017, International Finance and Banking Association, 2017, King’s College London and the University of Leicester.
