Book review

High-Frequency Trading

A good place to start the review of High-Frequency Trading is with the following two quotes, both from Prof. MacKenzie’s (2014) excellent paper on the topic:

Interviewee AD1:

Some companies don’t wait for the exchange to tell them what’s trading.

Author:

Oh, so how do you manage to…?

Interviewee AD1:

That I can’t … I mean not only would I lose my job, I might lose my legs too![1]

More succinctly and less dramatically, but just as eloquently, Durbin (2010)[2] puts it this way: ‘[i]n this business, everyone knows that loose lips get pink slips’.[3] So, how does one write an interesting book on high-frequency trading, when limbs, if not life, seem to be at stake?

Easley, De Prado and O’Hara have taken on this perilous task, and, given the constraints, have carried it out very well. At 228 pages of wide-formatted text, the book is slim and terse, but it will give the interested reader a useful high-level understanding of what high-frequency trading is all about, and a quantitative description of some important topics. It will be of great assistance to, among others, those regulators who want to get their heads around the controversy that the publication of Lewis’s (2014) Flash Boys has recently unleashed. Indeed, one of the greatest merits of the book under review is that it is not a breathless account of the brave new trading world with plentiful liquidity and wafer-thin margins that will simultaneously benefit hedge funds and widows and orphans; rather, it presents a variety of different, useful and plausible perspectives on the topic. Nor does the book attempt a reconciliation of these often discordant views, and, when it comes to highlighting the possible negative aspects of high-frequency trading, it tends to pull its punches.[4] This could be a defect, but could also be seen as a positive feature, in that the book provides the reader with enough information to draw her own conclusions.

In order to understand why high-frequency trading should be of relevance not just to super-specialized traders and high-end computer programmers, but to the wider financial community (and, indeed, to society at large), it pays to go back to basics. In the old, low-frequency trading days, liquidity was typically supplied by market makers. These market participants did not try to make a profit by taking directional views on the market. Instead, they kept their inventory sufficiently well stocked to satisfy buying requests, and light enough to accommodate the stock delivered to them by sellers. They did so because they tried to make money by capturing the bid-offer spread. By offering their service in all market conditions, they provided liquidity to the market. Historically, the importance of this liquidity-provision function in particularly important markets was recognized not just by the market participants, but also by the regulators, who would grant market makers special rights or exemptions. In exchange, market makers would stand committed to making a two-way market in reasonable size in all market conditions.

From the market-maker’s perspective, the ideal counterparty was (and is) the uninformed ‘noise’ trader, who transacts for reasons other than knowledge of changing fundamentals. To the extent that the orders from noise traders are uncorrelated, a streak of same-sign (say, ‘sell’) orders is a nuisance for the market maker, but ultimately only taxes her patience (and, possibly, her capital[5]), as the ‘buy’ orders will soon arrive and bring her inventory back to the balanced state she prefers. The nemesis of the market maker is the informed trader, who places orders while in possession of better information than the market maker. If fundamentals have, say, deteriorated, and the ‘sell’ orders come from informed traders, there will be no eventual countervailing buying demand at the end of the selling streak, and, by the time the market settles at the new (and lower) equilibrium price, the market maker will be nursing a loss. So, given two outwardly identical sets of same-sign orders, it is imperative for the market maker to recognize the trading footprint of the informed trader. In any case, a long series of same-sign trades puts the market maker under stress and inflicts losses on her portfolio.

In the pre-high-frequency-trading days, this state of affairs assumed the form of an implicit covenant or social contract: I, the market maker, will capture the bid-offer spread and make good money in normal market conditions from uninformed traders, but in exchange I will accept the losses incurred when dealing with informed traders and I will be there to provide liquidity (and shoulder the greater losses) when the going gets rough and the market becomes choppy.[6]

Let’s look at this setting from the opposite perspective, i.e. the point of view of the (‘informed’) institutional investor who has to execute a large buying or selling programme.[7] Clearly, if the institutional footprint is readily detected, the market maker will promptly react by defensively adjusting her bid or offer levels, and the order will quickly ‘move the market’, adversely so from the perspective of the institutional investor. So, the large informed trader will attempt to disguise the size and sign of her buying or selling programme for as long as possible (for instance, the orders may be placed at those times of the day when volume is highest).[8] Of course, waiting for an opportune time in the trading session, or chopping up a large order into digestible bites, reduces the market impact (a ‘drift’ term) but, by delaying completion, increases the uncertainty about the evolution of the ‘theoretical price’[9] (a ‘square-root’ term). The trade-off between immediacy of execution (with a certain, but large, negative price impact) and price uncertainty traces a CAPM-like ‘efficient frontier’. In this case, however, the ‘expected return’ and the ‘volatility’ are not exogenous to the problem, but at least partly depend on the actions of the investor, on the countermeasures of the market maker, on the second-order defenses deployed by the investor, and so on.
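
The drift-versus-square-root trade-off can be made concrete with a toy calculation. The Python sketch below is purely illustrative and is not taken from the book: it assumes a stylized, Almgren–Chriss-flavoured cost model, and every parameter (the order size X, daily volume V, volatility sigma, impact coefficient eta and the risk-aversion weight) is a made-up number.

import numpy as np

# Hypothetical, stylized numbers -- purely illustrative, not from the book.
X = 1_000_000        # shares to execute
V = 10_000_000       # average daily market volume (shares)
sigma = 0.02         # daily volatility of the 'theoretical price' (as a fraction of price)
eta = 0.1            # impact coefficient: cost grows with the participation rate X / (V * T)
risk_aversion = 1.0  # weight placed on price uncertainty

def impact_cost(T):
    """'Drift' term: market impact falls as the order is spread over more days."""
    return eta * (X / (V * T))

def timing_risk(T):
    """'Square-root' term: price uncertainty grows with the execution horizon."""
    return sigma * np.sqrt(T)

horizons = np.linspace(0.1, 5.0, 50)   # candidate execution horizons, in days
total = [impact_cost(T) + risk_aversion * timing_risk(T) for T in horizons]
best = horizons[int(np.argmin(total))]
print(f"cost-minimising horizon under these toy parameters: {best:.2f} days")

Sweeping the risk-aversion weight in this toy model traces out exactly the kind of ‘efficient frontier’ described above: faster execution lowers the timing risk but raises the impact cost, and vice versa.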

While to a large extent still valid today, the account presented so far better describes the state of play in the old, low-frequency world. A series of institutional, technological and regulatory changes have substantially altered this picture. In particular, regulatory and legislative changes (MiFID in Europe and the Securities Acts Amendments in the USA) were enacted to enhance competition and provide ultimate users with better execution. In the States, one of the tools to achieve these goals was the encouragement of the linking of share trading venues. The story is complex, but it is clear how the establishment of trading venues, the communication of trades and the speed of this communication progressively became more and more important.[10]

This state of affairs provided the ideal habitat for the evolution, alongside the old-style market makers (the ‘herbivores’, always wary of becoming the victims of adverse selection), of a new breed of market operators. These were definitely lean, mean and nimble ‘carnivores’, thinly capitalized players, that is, who would preferentially ‘take’ rather than ‘provide’ liquidity,[11] and who could opportunistically wedge themselves within the interstices of the uneasy but ultimately symbiotic relationship between the old, all-weather liquidity providers and the ultimate investors. Predators ‘constitute a very distinct species of informed traders, because of the nature of the information and the frequency of their actions. [Predators] exploit a microstructural opportunity in a way similar to that in which large speculators exploit a macroeconomic inconsistency. Rather than possessing exogenous information yet to be incorporated in the market price, they know that their endogenous actions are likely to trigger a microstructure mechanism, with a foreseeable outcome. Their advent has transformed liquidity provision into a tactical game.’[12]

This thumbnail description already conveys some of the important messages of the book, and of Chapter 1 in particular. First, that high-frequency trading is best described as a strategic interaction, to which the conceptual tools borrowed from game theory can be most profitably applied. (The strategic nature of high-frequency trading is repeatedly mentioned in the book, but poorly developed: this is a pity.)

Second, that high-frequency trading is not ‘old trading on steroids’, but redefines the concept of the informed trader, who is no longer the (typically institutional) investor who possesses better information about fundamentals. This is because ‘[high-frequency-trading] information relates to the trading process and not to the asset in itself. At longer time horizons, fundamental information predominates in determining asset prices, but in the very short term it is trading information that matters.’[13]

Third, that high-frequency trading has to be seen together with matching engines, trading venues and the attending regulations. More precisely, by strategically placing or removing orders, high-frequency-trading and execution algorithms directly interact with the matching engine, but, through the latter, they also indirectly interact with the trading venue. Each of these facets therefore constitutes a domain ‘in which humans write algorithms and algorithms augment and diminish human capabilities, replace humans, and sometimes confound their plans’.[14] So, for instance, when a mini-flash-crash occurs,[15] it is glib to explain it away just by pointing to the thick-fingered human who typed a few zeros too many when entering her order. One should also consider who in the new high-frequency environment provides liquidity, what the incentives are for these liquidity providers to remain in the game when the going gets tough, how strongly capitalized they are required to be, and so on. If one wants to understand the resilience of the whole ‘ecology’, one has to look at the matching engines, the trading venues, the predatory algorithms and the execution algorithms that try to elude their detection, the regulations around the lot, and how all these disparate components interact with each other.[16]

The very simplified sketch provided above clearly shows that what matters for the high-frequency actors is not calendar time, but how quickly their inventories become filled or depleted, i.e. ‘volume time’. The concept is natural enough and, in itself, not particularly new.[17] The original contribution by the editors of the book under review (Chapter 1) is to be found in two insights they have provided in a series of papers which have appeared over several years: first, on how to use the volume-clock concept to compute the Volume-Synchronized Probability of Informed Trading (VPIN), which is related to the accumulation of imbalanced orders; second, on the usefulness of VPIN as a leading indicator of ‘market toxicity’ and market instability. Given these two features, VPIN has become one of the most important metrics in the inventory control of a high-frequency trader. The concept is simple enough in theory, but the reader can get a glimpse of the difficulties in its practical computation in a terabyte world by reading the very interesting Chapter 6. The editors have done extensive work in the VPIN area and, in the opinion of this reviewer, they could have been less modest and adapted and included more of their numerous papers in the book.
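
To make the volume-clock idea more tangible, here is a minimal Python sketch of a VPIN-style statistic computed from a stream of trades. It is an assumption-laden toy, not the editors’ implementation: trades are classified as buys or sells with a simple tick rule rather than the bulk volume classification the authors actually use, and all names, parameters and the synthetic data are purely illustrative.

import numpy as np

def vpin_estimate(prices, volumes, bucket_volume, n_buckets=50):
    """Crude VPIN-style estimate.

    Trades are classified as buyer- or seller-initiated with a simple tick rule
    (an assumption; the authors use bulk volume classification), grouped into
    equal-volume buckets ('volume time'), and the absolute order-flow imbalance
    is averaged over the last n_buckets completed buckets.
    """
    buy = sell = bucket_filled = 0.0
    imbalances = []
    last_price = prices[0]
    for p, v in zip(prices, volumes):
        side_buy = p >= last_price      # tick rule: up-tick (or unchanged) counts as a buy
        last_price = p
        remaining = float(v)
        while remaining > 0:
            take = min(remaining, bucket_volume - bucket_filled)
            if side_buy:
                buy += take
            else:
                sell += take
            bucket_filled += take
            remaining -= take
            if bucket_filled >= bucket_volume:   # bucket complete: record |imbalance| / volume
                imbalances.append(abs(buy - sell) / bucket_volume)
                buy = sell = bucket_filled = 0.0
    if not imbalances:
        return float("nan")
    return float(np.mean(imbalances[-n_buckets:]))

# Toy usage on synthetic data
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 0.01, 10_000))
volumes = rng.integers(1, 100, 10_000)
print(vpin_estimate(prices, volumes, bucket_volume=5_000))

Even this toy version makes the ‘volume time’ point: buckets complete quickly when trading is heavy and slowly when it is light, so the imbalance series ticks in volume time rather than calendar time.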

As one moves from this 30,000-foot view to a more granular analysis, the most remarkable feature is how different high-frequency-trading strategies become in different markets: so, ‘[s]trategies that are optimal in short-term futures, for example, are very different from strategies that are successfully employed in equity markets’.[18] It is for this reason that the book under review helpfully looks in turn at the equity, fixed-income and FX markets in Chapters 2–4. (Chapter 3, by Robert Almgren, one of the leading researchers in the field, is the most interesting of this batch, especially because of the challenges posed by the pronounced co-integration of the fixed-income markets; Chapter 4 is the weakest; see my comments towards the end of this review.)

Chapter 5 deals with an important aspect of the new trading universe, i.e. how machines adaptively learn about market microstructure and trading patterns (momentum in particular is looked at in some detail). Among other interesting insights, it provides a list of entry-level defensive actions an institutional investor may consider to avoid becoming a ‘perfect victim’. The chapter is interesting and well written, but, given the wide scope of the topic under consideration, can only provide an amuse-bouche, not a hearty meal.

Chapter 6, as mentioned, affords a fascinating glimpse into the IT and computational complexity attending the computation of VPIN when ‘big data’ are at play. The authors (all researchers at the Lawrence Berkeley National Laboratory, with which one of the editors (MLdP) has also been affiliated) also analyse the predictive power (in terms of false positives and false negatives) of ‘big-data VPIN’ as a leading indicator of market toxicity. Perhaps incongruously, the very good and important Chapter 7, on liquidity and toxicity contagion, comes after the study on how to compute VPIN.

As explained in the introductory account, institutional investors are particularly fearful of providing information, through their buying and selling orders, to ‘predators’, i.e. to traders capable of quickly detecting and correctly interpreting the nature of the programme, and of front-running the asset managers. It therefore becomes very important to understand to what extent algorithmic trading ‘leaks’ information to predators, and this is the topic covered by the Goldman Sachs authors of Chapter 8. By looking at programmes executed using their own proprietary (GSET) execution algorithms, they conclude that ‘these algo executions do not leak information’.[19] This is interesting, and I do not have any reason to doubt the integrity and scientific correctness of the study, but I would have preferred a third party, and not the same group who ultimately develops and markets the algorithms, to test the hypothesis.

Chapter 9 explains how a price process can be decomposed into a transitory and a permanent component, and presents a methodology that is particularly suited to investors (such as asset managers) who slice and dice their orders to avoid the information leakage that is the topic of Chapter 8. It also presents a definition of the elusive ‘theoretical price’ that informs much of the formalization of microstructure analysis. Given its foundational value, I would have placed this useful chapter nearer to the beginning, but I am well aware that no arrangement of material is perfect.
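
As a rough illustration of the transitory/permanent idea (and emphatically not the chapter’s own methodology), one can posit a local-level model in which the observed price is an unobservable ‘theoretical’ random walk plus transient microstructure noise, and filter it with a simple Kalman recursion. Everything in the Python sketch below, including the variances q and r, is a hypothetical toy choice.

import numpy as np

def decompose(prices, q=1e-4, r=1e-2):
    """Toy local-level Kalman filter: observed price = permanent random walk + transitory noise.

    q is the assumed variance of the permanent (efficient-price) innovations and
    r the variance of the transitory component; both are illustrative guesses.
    Returns the filtered permanent component and the transitory residual.
    """
    m, p = prices[0], 1.0          # state estimate and its variance
    permanent = []
    for y in prices:
        p = p + q                  # predict: the state follows a random walk
        k = p / (p + r)            # Kalman gain
        m = m + k * (y - m)        # update with the observed price
        p = (1 - k) * p
        permanent.append(m)
    permanent = np.asarray(permanent)
    transitory = np.asarray(prices) - permanent
    return permanent, transitory

# Toy usage on synthetic data: a random-walk 'efficient' price plus noise
rng = np.random.default_rng(1)
efficient = 100 + np.cumsum(rng.normal(0, 0.01, 1_000))
observed = efficient + rng.normal(0, 0.05, 1_000)
perm, trans = decompose(observed)
print(np.std(trans), np.corrcoef(perm, efficient)[0, 1])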

Chapter 10, by academics at Cambridge, Cornell and the LSE, is very good, and I would have preferred it to have been given more space and its themes to have been developed more fully. It looks at the types of regulation that one may consider if one wanted to retain the positive spin-offs of high-frequency trading, but avoid the potential negative systemic externalities. Indeed, one important topic of the chapter, an aspect which is too often neglected, is the importance of ‘negative effects from infrequent, but periodic, [market] instability.’[20] Unhelpfully, the debate about the costs and benefits of high-frequency trading seems to focus almost exclusively on whether this new trading setting really affords greater liquidity and thinner transaction costs to market participants (including retail investors). Authors such as Lewis (2014) have built an impassioned case that it does not; with similar passion, market participants (alas, with a transparent vested interest), but also academics, have claimed that it is an unalloyed good. Both sides of the argument neglect that the social cost that might accrue from an amplified repeat of the May 2010 flash crash, perhaps occurring during independently distressed market conditions, far outweighs the fraction-of-a-basis-point narrowing of the bid-offer spread. The several mini-crashes experienced so far can either be seen as ‘proof’ that in the end the system works; or they can be interpreted as near misses that should make us not complacent but worried. The contributors to Chapter 10 explore and insightfully discuss several ways in which some sand might be thrown into the high-frequency-trading engine to make it more stable, but they recognize that, by its very nature, the problem is an ever-changing, adaptive target.

To conclude, a few quibbles, mostly, but not all, small.

To start with the small ones: the quality of the figures is mixed, and indifferent on average. Some choices of display (such as the 3D rods in figure 5.3) seem designed to minimize the amount of information conveyed. And, since the book is printed in black-and-white, greater care should have been given to how fifty shades of grey would look in a graph such as figure 4.2. One does not have to be a convert to Tufte’s (2006) ‘beautiful evidence’ vision to aspire to producing more information-rich and easy-to-read illustrations than the Excel-produced figures found in some of the chapters.

Second, the acronyms (which, as my first editor taught me, are easy on the writer but heavy on the reader) are far too frequent: as the book is clearly intended for non-specialists, this can sap the will to live (or, at least, to read on) of the less dogged reader.

Last, but, alas, more importantly, in one chapter (Chapter 5), one of the descriptions of market venues (for Oanda) reads in parts like an infomercial.[21] This is particularly unfortunate, because a quick Google search reveals that one of the authors (R.B. Olsen), whose affiliation in the book is coyly displayed as ‘Olsen Ltd’, is one of the co-founders of Oanda. A bit more transparency and disclosure would not have gone amiss.

Despite these minor blemishes (and the not-so-minor one mentioned above), the book is well written and carefully edited, and provides much-needed insight into a poorly understood area of finance. Some of the chapters (especially, but not only, those written by the editors) are top quality. Among these, some of the contributions helpfully frame the debate about the social benefits and dangers posed by high-frequency trading. There is far more to the book than this, but this aspect alone would make it a very useful contribution to the existing literature.

Riccardo Rebonato
Oxford University, UK
© 2015, Riccardo Rebonato

Acknowledgements

It is a pleasure to acknowledge useful comments from Prof Donald MacKenzie.

Notes

1 MacKenzie (2014), p. 15.

2 Page 2, quoted in MacKenzie (2014).

3 It is important to stress that the omertà-inspired behaviour encountered in high-frequency trading is not found in other areas of financial modelling and research, such as derivatives pricing. This, I believe, is no accident, but it would take too long a detour to explain why these different behaviours have become established. The interested reader is referred to Rebonato (2013).

4 For instance, in their introductory chapter (p. xvi), the editors mention the expenditure of ‘hundreds of millions of US dollars to lay a new cable under the Atlantic Ocean’ to shave off a few milliseconds in the communication of trading orders. They then rather elliptically remark: ‘[i]t is only natural to question whether such expenditures are socially optimal’.

5 If the market maker has to remain committed to making a two-way market in all market conditions, the attending capitalization requirements can be onerous indeed. Not so for the fair-weather market maker. As we shall see, this has profound implications for the quality of the liquidity provision, i.e. for the availability of liquidity in conditions of market distress (when most needed).

6 The picture I have painted bears a good resemblance to how reality worked, but should not be taken too literally. In periods of severe market dislocation, even old-fashioned market makers have been known to lose the ability to ‘pick up the phone’. It must be added, however, that, in the ‘old days’, the social and institutional memory of who the fair-weather market makers were, and who could truly be relied upon to provide liquidity when needed, remained vivid in the trading community for a long time.

7 ‘Large’ in this context means several multiples of the typical market size.

8 See Easley, De Prado and O’Hara, p. 14 and passim. See also Chapter 5 of the book under review.

9 Roughly speaking, this is the (unobservable) fundamentals-linked price that would obtain absent the market impact. For a more precise discussion, see Chapter 9 in Easley, De Prado and O’Hara.

10 See MacKenzie (2014), pp. 9–10 for a short but good discussion of these points.

11 For a discussion of the differences between liquidity-making and liquidity-taking (a distinction that, as MacKenzie (2014) points out, is ‘freighted with moral significance’), see MacKenzie (2014), p. 25 and passim.

12 Easley, De Prado and O’Hara, p. 9, emphasis added.

13 Page xvi, Easley, De Prado and O’Hara.

14 MacKenzie (2014), p. 10.

15 Even if rarely publicized, these are rather common occurrences: the sudden fall and rise of the 10-year US Treasury over a few-hour period in late 2014 is the best example after the Flash Crash of May 2010.

16 Kirilenko et al. (2011) look in detail at the May 2010 flash crash and analyse how a large sell order generated a range of coordinated responses from a variety of diverse market participants.

17 To my knowledge, Mandelbrot (1973) was among the first to point out that, when an appropriate ‘local time’ different from ‘clock time’ was used, the resulting distribution of price changes was closely approximated by a Gaussian distribution. Recent developments in the probabilistic description of price processes for derivatives-pricing purposes that make use of deterministic or stochastic time changes employ a similar intuition.

18 Easley, De Prado and O’Hara, p. xvii.

19 Page 159.

20 See, in particular, p. 209 and passim.

21 The reader is even informed that Oanda pays interest on a second-by-second basis.

References

  • Durbin, M., All About High-Frequency Trading, 2010 (McGraw-Hill: New York).
  • Kirilenko, A., Kyle, A., Samadi, M. and Tuzun, T., The flash crash: The impact of high-frequency trading on an electronic market. Working Paper, University of Maryland, College Park, 2011.
  • Lewis, M., Flash Boys: Cracking the Money Code, 2014 (Penguin: London).
  • MacKenzie, D., A sociology of algorithms: High-frequency trading and the shaping of markets. Submitted to Am. J. Sociol., 2014. Available online at: www.sps.ed.ac.uk/staff/sociology/mackenzie_donald (accessed 19 January 2015).
  • Mandelbrot, B., Comments on: “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices,” by Peter K. Clark. Econometrica, 1973, 41(1), 157–159. doi:10.2307/1913890
  • Rebonato, R., How derivatives and risk models really work: Sociological pricing and the role of co-ordination. SSRN Working Paper, 2013. Available online at: http://ssrn.com/abstract=2365294 (accessed 16 January 2015).
  • Tufte, E.R., Beautiful Evidence, 2006 (Graphics Press LLC: Cheshire, CT).
