25 Years of the IPO


Speed or Excess?

Prof. Dr. Peter Gomber

Financial markets thrive on speed. But when is it a necessary tool – and when an irrational rush?

Finance is an information business. The use of technologies that enable rapid transmission and processing of information is therefore as old as financial markets themselves. As early as 1850, Paul Julius Reuter managed to transmit news and stock prices between Paris and Berlin faster than his competitors – using a combination of carrier pigeons and telegraphy. In doing so, he paved the way for what would become one of the world’s largest news agencies.

Technological advances like these have steadily accelerated securities trading and repeatedly sparked debate: What opportunities arise from ever-higher speeds – and what risks do they entail? Who benefits, and who may be put at a disadvantage? At stake is the broader question of how continuous acceleration reshapes the nature of markets themselves.

The rise of electronic trading systems

Every major innovation in computing and telecommunications has left an immediate imprint on information flows in financial markets. First telegraphy, then the telephone enabled the near-instant transmission of prices between geographically dispersed markets, making arbitrage possible. In the early 20th century, electrified stock tickers and standardized ticker symbols allowed for the continuous, machine-readable distribution of real-time prices. Later, global satellite communication opened up arbitrage opportunities across time zones and contributed to the emergence of Eurodollar and derivatives markets. In 1971, the NASDAQ went live, displaying prices for 2,500 OTC securities. In London, a similar system was introduced in 1986 with SEAQ. Both NASDAQ and SEAQ were purely quotation systems: they supported telephone-based price negotiation but did not yet electronically match buy and sell orders.

Beyond these innovations in information transfer, the introduction of electronic trading systems was the decisive step for today’s financial markets. Exchange operators that recognized the potential of these systems early on benefited in particular: they saw the advantage of shifting from human trading among a few privileged participants to electronic, open order books, and gained significant competitive advantages as a result. Deutsche Börse Group was one of these pioneers.

In Germany, IBIS was launched in 1989, followed by the German Futures and Options Exchange (Deutsche Terminbörse – DTB) in 1990 and Xetra in 1997. Unlike IBIS – where order matching was not yet electronic but relied on the manual selection of offers at the touch of a button – Xetra, like the DTB, the predecessor of Eurex, is a fully electronic trading system that automatically executes matching orders against each other. The core element of an electronic trading system is the central, open limit order book, which provides a transparent, anonymous, and cost-efficient mechanism for aggregating and storing open limit orders and matching them in real time via algorithms. Beginning in the early 1990s, many major securities exchanges on both sides of the Atlantic transitioned to fully electronic trading. Floor trading steadily became obsolete, and in June 1999 the DAX was calculated for the first time solely on the basis of Xetra prices. By the time Deutsche Börse Group went public on February 5, 2001, trading was already running on the fourth version of Xetra (Release 4).
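
The matching principle at the heart of such systems – price priority first, time priority second – can be sketched in a few lines of Python. This is a minimal illustrative toy, not Xetra’s actual implementation, which adds order types, auctions, and safeguard mechanisms far beyond this:

    from dataclasses import dataclass

    @dataclass
    class Order:
        side: str      # "buy" or "sell"
        price: float   # limit price
        qty: int

    class LimitOrderBook:
        def __init__(self):
            self.bids = []  # resting buy orders, best (highest) price first
            self.asks = []  # resting sell orders, best (lowest) price first

        def submit(self, order):
            resting, opposite = ((self.bids, self.asks) if order.side == "buy"
                                 else (self.asks, self.bids))
            def crosses(quote):
                return (order.price >= quote.price if order.side == "buy"
                        else order.price <= quote.price)
            # Match against the opposite side as long as prices cross.
            while order.qty > 0 and opposite and crosses(opposite[0]):
                quote = opposite[0]
                traded = min(order.qty, quote.qty)
                print(f"trade: {traded} @ {quote.price}")  # executes at resting price
                order.qty -= traded
                quote.qty -= traded
                if quote.qty == 0:
                    opposite.pop(0)
            # Any remainder rests in the book, best price first; Python's stable
            # sort preserves arrival order at equal prices (time priority).
            if order.qty > 0:
                resting.append(order)
                resting.sort(key=lambda q: q.price, reverse=(order.side == "buy"))

    book = LimitOrderBook()
    book.submit(Order("sell", 100.5, 200))
    book.submit(Order("buy", 100.5, 120))  # prints: trade: 120 @ 100.5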

The electronification of markets triggered a wave of technological innovation among market participants. On the sell side – i.e., at banks and brokers – systems were introduced to monitor price thresholds (“electronic eyes”) or to generate bid and ask quotes automatically based on predefined parameters (“quote machines”), supporting market makers and liquidity providers. On the buy side, institutional investors – especially large asset managers – began setting up electronic trading desks. The introduction of the FIX (Financial Information eXchange) protocol enabled standardized electronic transmission of trade-related messages worldwide and became the de facto standard for communication between the buy side, the sell side, and trading venues before and during execution.
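
The FIX protocol itself is deliberately simple: a message is a flat sequence of numbered tag=value fields separated by a control character (SOH, 0x01). Below is a Python sketch of a limit buy order (MsgType 35=D, “NewOrderSingle”); the counterparty IDs and the instrument are invented, and session-level fields such as sequence number and sending time are omitted for brevity:

    SOH = "\x01"

    def fix_message(fields):
        # Body: everything after BodyLength (tag 9) up to CheckSum (tag 10).
        body = SOH.join(f"{tag}={val}" for tag, val in fields) + SOH
        header = f"8=FIX.4.4{SOH}9={len(body)}{SOH}"
        checksum = sum((header + body).encode()) % 256  # tag 10, three digits
        return f"{header}{body}10={checksum:03d}{SOH}"

    order = fix_message([
        (35, "D"),         # MsgType: NewOrderSingle
        (49, "BUYSIDE1"),  # SenderCompID (invented)
        (56, "BROKER1"),   # TargetCompID (invented)
        (55, "DB1"),       # Symbol (invented)
        (54, "1"),         # Side: 1 = buy
        (38, "500"),       # OrderQty
        (40, "2"),         # OrdType: 2 = limit
        (44, "187.50"),    # Price
    ])
    print(order.replace(SOH, "|"))  # render the separator readably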

Faster – and more volatile?

In the early 2000s, the sell side began deploying the first trading algorithms – software that combines real-time and historical market data to execute quantitative trading strategies autonomously, without human intervention – making execution faster, more efficient, and cheaper. As brokers realized that institutional clients such as asset managers could benefit from these advances, algorithmic trading was increasingly offered to the buy side as a service, allowing large orders to be intelligently sliced and distributed over time and across fragmented markets.
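
A simple member of this family is a TWAP-style schedule, which cuts a parent order into equal child orders spread evenly over a time window. The sketch below is a bare-bones illustration; production algorithms adapt dynamically to volume, price, and venue fragmentation:

    def twap_schedule(total_qty, start_min, end_min, n_slices):
        # Split total_qty into n_slices child orders at evenly spaced times;
        # the first `rem` slices absorb the remainder of the integer division.
        qty, rem = divmod(total_qty, n_slices)
        step = (end_min - start_min) / n_slices
        return [(round(start_min + i * step), qty + (1 if i < rem else 0))
                for i in range(n_slices)]

    # 100,000 shares over a 390-minute trading day, in 13 slices:
    for t, q in twap_schedule(100_000, 0, 390, 13):
        print(f"t+{t} min: send child order for {q} shares")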

From around 2006, the sell side began using exchange colocation and proximity services, for example for trading on Xetra and Eurex. The objective was to meet their own need – and that of their buy-side clients – to minimize transmission latency between order submission and order arrival or execution. Colocation refers to placing one’s own trading algorithms in stock exchange data centers. Proximity services involve locating systems as close as possible to those centers. The term high-frequency trading emerged, initially within expert circles. In high-frequency trading, market participants – known as high-frequency traders (HFTs) – use proprietary capital and ultra-fast execution to capture small margins on individual trades that add up to substantial aggregate trading profits. A significant event in the development of securities trading was a phenomenon known as the flash crash in the United States. In May 2010, the Dow Jones Industrial Average lost almost 10% of its value within minutes before recovering quickly, and individual securities saw dramatic price fluctuations. The flash crash sharply intensified the public and regulatory debate about the advantages – and especially the disadvantages – of trading technologies.
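
A back-of-envelope calculation makes clear why colocation pays at these time scales: signals in optical fiber travel at roughly two-thirds the speed of light, about 200,000 km/s. The distances below are rough illustrative figures, not measured routes:

    FIBER_KM_PER_S = 200_000  # approx. signal speed in optical fiber

    def one_way_latency_us(distance_km):
        return distance_km / FIBER_KM_PER_S * 1e6  # microseconds

    for label, km in [("colocated rack, ~100 m", 0.1),
                      ("across Frankfurt, ~10 km", 10),
                      ("Frankfurt to London, ~640 km", 640)]:
        print(f"{label}: ~{one_way_latency_us(km):,.1f} microseconds one way")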

To many observers – and to the general public – trading in milliseconds, microseconds, or even nanoseconds appears inherently destabilizing and systematically disadvantageous to retail investors and non-HFT participants. Yet it is crucial to distinguish carefully between different HFT strategies.

Who is reaching for the cash?

One of the most common HFT strategies is liquidity provision. HFT liquidity providers supply the markets with liquidity and earn profits from the spread between bid and ask prices. For these HFT strategies, speed is a core risk management tool. Each quote of an HFT liquidity provider represents a free option for other market participants: if new geopolitical, macroeconomic, or firm-specific information emerges, others can immediately trade against the liquidity provider – at the provider’s expense.
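
A stylized calculation illustrates these economics. Suppose the provider earns the half-spread on benign fills but loses a larger amount whenever news moves the market through a stale quote before it can be cancelled; all numbers are invented for illustration:

    HALF_SPREAD = 0.01   # earned per share on an uninformed fill
    ADVERSE_MOVE = 0.05  # lost per share when news hits the quote first

    def expected_pnl(p_informed):
        # p_informed: share of fills that are adversely selected
        return (1 - p_informed) * HALF_SPREAD - p_informed * ADVERSE_MOVE

    for p in (0.05, 0.10, 0.20):
        print(f"p_informed={p:.2f}: E[PnL per share] = {expected_pnl(p):+.4f}")

Speed acts on p_informed: the faster the provider can cancel after news, the smaller the share of fills that are adversely selected – and the tighter the spread it can afford to quote.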

The principle can be compared to leaving a banknote on a crowded marketplace for all to see. As long as the owner – in this case the liquidity provider – keeps a hand close to it and can react quickly, the risk is limited. Step away even slightly, and the risk increases that someone else will grab the money.

Greater speed therefore reduces the liquidity providers’ risk of incurring losses. That, in turn, increases these HFTs’ willingness to provide liquidity at all – with tight bid-ask spreads and high volumes. Retail and institutional investors benefit from narrow spreads, deep order books, and lower transaction costs.

Who might grab the banknote? Other market participants use latency arbitrage strategies to exploit new (public) information as quickly as possible: they react immediately to changes in market valuations and execute the liquidity provider’s quote at a profit before that provider can adjust it in response to the news. In metaphorical terms, they try to snatch the banknote before its owner can react.

Market operators can intervene here. By introducing small speed limits for those attempting to execute against quotes, exchanges can give liquidity providers enough time to respond to new information. This allows the liquidity provider to decide whether to leave the banknote (their quote) on the marketplace or withdraw it. Such mechanisms – often called speed bumps – have been implemented by venues including IEX in the U.S. and the Toronto Stock Exchange, and, in the form of “passive liquidity protection”, at Eurex, with the aim of improving liquidity through tighter spreads.
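
The logic of such a mechanism can be captured in a toy timing model: only aggressive orders pass through the delay, while the liquidity provider’s cancellation does not. All timings are invented microsecond figures, not the parameters of any actual venue:

    def quote_picked_off(news_t, arb_latency, mm_latency, bump_us):
        # The aggressor's order is delayed by the speed bump;
        # the market maker's cancel is not.
        arb_hits_quote = news_t + arb_latency + bump_us
        mm_pulls_quote = news_t + mm_latency
        return arb_hits_quote < mm_pulls_quote

    # Arbitrageur reacts in 5 us, market maker needs 50 us:
    for bump in (0, 100):
        hit = quote_picked_off(0, 5, 50, bump)
        print(f"bump = {bump:3d} us -> quote {'picked off' if hit else 'saved'}")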

Risk management or market abuse?

Another widespread HFT strategy is arbitrage. Traders exploit price discrepancies and market imbalances – for example between the same security traded on different venues, or between derivatives or ETFs and their underlying assets. Such strategies require algorithms that detect imbalances instantly and place orders at high speed on the respective marketplaces or in the respective securities. Here, too, speed serves as risk management: it reduces the likelihood that only one leg of an arbitrage trade is executed. For institutional and retail investors, this arbitrage in turn increases the synchronization and efficiency of price formation. The broadly aligned price levels enable favorable execution conditions on various marketplaces without the need for technical or regulatory links between them.
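
In its simplest form, such a strategy compares quotes for the same security on two venues and trades only when the gap exceeds total costs. A stylized sketch with invented prices; the trailing comment points to the legging risk that speed is meant to contain:

    def arb_signal(bid_a, ask_a, bid_b, ask_b, cost):
        # Buy on the venue where the security is cheap, sell where it is rich,
        # but only if the price gap more than covers fees and expected slippage.
        if bid_b - ask_a > cost:
            return ("buy on A / sell on B", round(bid_b - ask_a - cost, 4))
        if bid_a - ask_b > cost:
            return ("buy on B / sell on A", round(bid_a - ask_b - cost, 4))
        return None

    # Venue A quotes 100.00/100.02, venue B quotes 100.05/100.07, costs 0.02:
    print(arb_signal(100.00, 100.02, 100.05, 100.07, 0.02))
    # -> ('buy on A / sell on B', 0.01); both legs must execute near-simultaneously,
    #    otherwise one leg fills alone and the "arbitrage" becomes an open position.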

There is no doubt, however, that some strategies use speed to identify and exploit other participants’ trading intentions, flood markets with fake orders, or create misleading impressions of supply and demand to gain an advantage for one’s own strategy. This is not risk management – it is market abuse. Such strategies are explicitly prohibited under the European Market Abuse Regulation and its national implementations, and must be effectively combated and sanctioned by supervisory authorities.

High-frequency trading leverages cutting-edge technology in market access, market data availability, and order routing to maximize the returns of established trading strategies; it is a technical means of implementing those strategies, not a strategy in its own right. Discussions and assessments of high-frequency trading should therefore focus primarily on the underlying strategies and their effects on price efficiency, market quality, and market liquidity for all market participants. For many of these strategies, intelligent use of new technologies and competition for speed are not ends in themselves, but a central and necessary tool of risk management.


Prof. Dr. Peter Gomber is Professor of e-Finance at Goethe University Frankfurt and Co-Chair of efl – the Data Science Institute. He is a member of the Exchange Council of the Frankfurt Stock Exchange, the Supervisory Board of Clearstream Europe AG, and a research fellow at the Leibniz Institute SAFE. His research focuses on market microstructure, digital finance/fintech, and electronic securities trading.