The Rise of Algorithmic Trading

Algorithmic trading, the use of computer programs to make trading decisions and execute orders, accounts for more than 60% of U.S. equity trading volume in the mid-2020s. High-frequency trading (HFT), the fastest subset of algorithmic trading, operates on timescales measured in microseconds and has fundamentally changed market structure, liquidity provision, and the economics of exchanges. The shift from human traders shouting on exchange floors to algorithms competing at the speed of light happened over roughly four decades, driven by advances in computing, changes in market regulation, and the relentless pressure to reduce transaction costs.

The Precursors

Computers entered financial markets gradually. In the 1970s, institutions began using simple programs to split large orders into smaller pieces and execute them over time, reducing the market impact of large trades. These early "program trading" systems were unsophisticated by modern standards, but they represented the first use of automation in order execution.

Portfolio insurance, developed by Hayne Leland and Mark Rubinstein in the early 1980s, used computerized trading rules to replicate the payoff of a put option by dynamically adjusting a portfolio's stock and cash allocations. As stock prices fell, the program sold stocks (or stock index futures) and moved into cash. As prices rose, it reversed the process. By 1987, portfolio insurance strategies managed an estimated $60 billion to $90 billion in assets.

The strategy's flaw was revealed on Black Monday, October 19, 1987. As the market declined, portfolio insurance programs generated massive sell orders, which pushed prices lower, which triggered more sell orders, creating a self-reinforcing spiral. The Dow fell 22.6% in a single session. The Brady Commission, appointed to investigate the crash, identified portfolio insurance as a contributing factor. The episode demonstrated that automated trading strategies could amplify market moves in ways that their designers had not anticipated.

Regulation and Electronic Markets

Two regulatory changes in the 1990s and 2000s created the conditions for algorithmic trading's rapid growth.

The SEC's Order Handling Rules of 1996 required market makers on Nasdaq to display orders from electronic communication networks (ECNs) when those orders offered better prices than the market maker's own quotes. This rule broke the market makers' oligopoly on Nasdaq, where spreads had been artificially wide. ECNs like Instinet and Island emerged as competing venues, offering faster, cheaper execution.

Decimalization in 2001 replaced the old system of quoting prices in fractions (one-eighth and one-sixteenth of a dollar) with penny increments, part of a broader shift of stock exchanges toward electronic trading. Penny pricing compressed bid-ask spreads dramatically: the average spread on NYSE-listed stocks fell from roughly 6 cents to 1-2 cents. Narrower spreads meant less profit per trade for market makers, which favored high-volume, automated strategies over human traders who could handle only limited order flow.

Regulation NMS (National Market System), adopted by the SEC in 2005, was the most significant structural change. It mandated that orders be routed to the venue offering the best price (the "Order Protection Rule"), regardless of which exchange the stock was listed on. This rule knitted together the fragmented U.S. equity market into a single interconnected system and created powerful incentives for speed. The first firm to react to a price change on one exchange could profit by trading on another exchange before the new price was reflected there.

The Speed Race

The combination of electronic markets, narrow spreads, and inter-exchange competition created an arms race for speed. Trading firms invested heavily in technology to reduce the interval between receiving market data and executing a trade, a quantity known as latency.

In the early 2000s, latencies were measured in milliseconds (thousandths of a second). By the late 2000s, they were measured in microseconds (millionths of a second). By the 2010s, firms were pursuing nanosecond (billionths of a second) advantages.

The investments required to achieve these speeds were enormous. Firms colocated their servers in the same data centers as exchange matching engines, paying premium rents for the physical proximity that shaved microseconds off transmission times. They built dedicated fiber-optic and microwave communication links between exchanges in different cities. Spread Networks, a firm profiled in Michael Lewis's 2014 book "Flash Boys," spent $300 million to build a fiber-optic cable between Chicago and New Jersey along the straightest possible route, reducing round-trip transmission time from approximately 17 milliseconds to 13 milliseconds.

Microwave towers subsequently replaced fiber-optic cables for the fastest applications, because radio waves travel through air faster than light travels through glass. Firms erected microwave relay towers along the route between major exchange data centers, shaving additional microseconds. The technology race extended to laser communication systems and even to experiments with neutrino-based transmission that could theoretically pass through the Earth in a straight line.
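The physics behind the switch from fiber to microwave is easy to quantify. A minimal back-of-the-envelope sketch, where the straight-line distance and the fiber refractive index of about 1.47 are illustrative assumptions rather than exact figures for any real route:

```python
# Rough comparison of fiber vs. microwave propagation delay.
# DISTANCE_M and FIBER_INDEX are illustrative assumptions, not
# exact figures for any particular commercial route.

C = 299_792_458          # speed of light in vacuum (m/s)
FIBER_INDEX = 1.47       # typical refractive index of optical fiber (assumed)
DISTANCE_M = 1_180_000   # rough straight-line Chicago-to-New Jersey distance (m)

def one_way_ms(distance_m: float, speed_m_s: float) -> float:
    """Propagation delay in milliseconds, ignoring switching/equipment latency."""
    return distance_m / speed_m_s * 1_000

fiber_ms = one_way_ms(DISTANCE_M, C / FIBER_INDEX)  # light slows down in glass
microwave_ms = one_way_ms(DISTANCE_M, C)            # radio in air ~ vacuum speed

print(f"fiber:     {fiber_ms:.2f} ms one-way")
print(f"microwave: {microwave_ms:.2f} ms one-way")
print(f"advantage: {fiber_ms - microwave_ms:.2f} ms one-way")
```

Under these assumptions the microwave path saves well over a millisecond each way, an eternity relative to microsecond-scale trading decisions.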

How Algorithms Trade

Algorithmic trading encompasses a wide spectrum of strategies, from simple execution algorithms to complex predictive models.

Execution algorithms, sometimes called "agency algorithms," are used by institutional investors to execute large orders with minimal market impact. A pension fund selling a million shares of a mid-cap stock uses an algorithm to slice the order into small pieces and execute them over hours or days, varying the timing and size of each slice to avoid signaling the fund's intentions to the market. Common execution algorithms include VWAP (volume-weighted average price), TWAP (time-weighted average price), and implementation shortfall algorithms.
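The slicing idea can be sketched in a few lines of Python. This is a simplified TWAP-style scheduler under assumed parameters (the function name, the jitter level, and the equal-interval assumption are all illustrative); real execution algorithms also adapt to live volume, price, and venue conditions:

```python
import random

def twap_slices(total_shares: int, n_slices: int, jitter: float = 0.2,
                seed: int = 42) -> list[int]:
    """Split a parent order into n child orders of roughly equal size.

    Each slice is randomized by up to +/- jitter so the schedule does not
    show a detectable fixed pattern. A simplified TWAP-style sketch, not a
    production execution algorithm."""
    rng = random.Random(seed)
    weights = [1 + rng.uniform(-jitter, jitter) for _ in range(n_slices)]
    total_w = sum(weights)
    sizes = [int(total_shares * w / total_w) for w in weights]
    for i in range(total_shares - sum(sizes)):  # hand out rounding remainder
        sizes[i] += 1
    return sizes

schedule = twap_slices(1_000_000, 50)
print(len(schedule), sum(schedule), min(schedule), max(schedule))
```

Normalizing random weights (rather than fixing a base size and jittering it) guarantees the child orders sum exactly to the parent order while every slice stays positive.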

Market-making algorithms provide liquidity by continuously quoting bid and ask prices. They profit from the bid-ask spread, buying at the bid price and selling at the ask price. Automated market makers adjust their quotes in response to order flow, inventory levels, and changing market conditions, typically maintaining positions for very short periods. Virtu Financial, one of the largest HFT firms, reported a profitable trading day on 1,277 out of 1,278 trading days between 2009 and 2014.
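A toy sketch of inventory-aware quoting makes the mechanics concrete. The skew parameter and prices here are made-up illustrative values, and real market makers condition on far more than inventory:

```python
def make_quotes(mid: float, half_spread: float, inventory: int,
                skew_per_share: float = 0.0002) -> tuple[float, float]:
    """Toy market-maker quoting rule: quote symmetrically around the mid
    price, then shift both quotes down when inventory is long (to attract
    buyers and shed the position) and up when short. The skew parameter is
    an illustrative assumption, not a calibrated value."""
    center = mid - skew_per_share * inventory
    return round(center - half_spread, 2), round(center + half_spread, 2)

print(make_quotes(100.00, 0.01, inventory=0))    # flat book: symmetric quotes
print(make_quotes(100.00, 0.01, inventory=500))  # long book: quotes skew down
```

Skewing quotes with inventory is the standard way an automated market maker keeps its position near zero while still earning the spread on balanced flow.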

Statistical arbitrage algorithms identify and exploit temporary price discrepancies between related securities. If two stocks that normally move together diverge, the algorithm buys the relatively cheap one and sells the relatively expensive one, betting on convergence. These strategies operate across thousands of securities simultaneously and require sophisticated models of the relationships between prices.
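The core convergence bet can be sketched with a simple z-score rule. The spread history, the entry threshold of two standard deviations, and the signal names are all illustrative assumptions; production systems use far richer statistical models across thousands of pairs:

```python
from statistics import mean, stdev

def pair_signal(spread_history: list[float], current_spread: float,
                entry_z: float = 2.0) -> tuple[str, float]:
    """Toy pairs-trading signal.

    Z-scores the current price spread between two related stocks against
    its recent history. A large positive z means stock A looks rich
    relative to stock B (sell A, buy B); a large negative z means the
    reverse. Illustrative sketch only."""
    mu, sigma = mean(spread_history), stdev(spread_history)
    z = (current_spread - mu) / sigma
    if z > entry_z:
        return "sell_A_buy_B", z
    if z < -entry_z:
        return "buy_A_sell_B", z
    return "flat", z

history = [1.00, 1.10, 0.90, 1.05, 0.95, 1.00, 1.10, 0.90]
print(pair_signal(history, 1.25))  # spread unusually wide: bet on convergence
print(pair_signal(history, 1.00))  # spread near its mean: no trade
```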

Momentum and trend-following algorithms identify short-term price trends and trade in the same direction. They may hold positions for seconds, minutes, or hours, depending on the frequency of the strategy.

The Flash Crash

The flash crash of May 6, 2010, was the most dramatic demonstration of algorithmic trading's impact on market stability. At approximately 2:32 PM Eastern Time, the Dow Jones Industrial Average began a rapid decline. Within minutes, the Dow had fallen nearly 1,000 points (approximately 9%), with some individual stocks briefly trading at absurd prices. Accenture shares dropped to one cent. Procter & Gamble shares fell 37% in minutes. The market recovered most of the decline within 20 minutes, but the episode shook confidence in market structure.

The SEC and CFTC's joint report, published in September 2010, attributed the crash to a single large sell order of E-mini S&P 500 futures contracts placed by a Kansas City mutual fund company (later identified as Waddell & Reed). The order, executed through an algorithm that did not account for the speed of execution, overwhelmed the available liquidity. High-frequency market makers, which had been providing liquidity, withdrew from the market as conditions became chaotic. The absence of human judgment, which might have recognized the sell-off as a temporary dislocation rather than a fundamental move, allowed the decline to cascade.

The SEC responded by implementing single-stock circuit breakers (later replaced by the Limit Up-Limit Down mechanism) that pause trading in individual securities when prices move too far too fast. The flash crash also intensified the debate about whether high-frequency trading contributed to or detracted from market stability.

The Debate

The rise of algorithmic and high-frequency trading has generated a sustained debate among regulators, market participants, and academics.

Proponents argue that algorithmic trading has reduced transaction costs for investors. Bid-ask spreads are narrower than at any point in market history. Trading commissions have fallen to zero for retail investors. Execution quality, as measured by the prices at which orders are filled relative to prevailing market quotes, has improved. Academic studies have generally found that the arrival of HFT in a market is associated with tighter spreads and improved price discovery.

Critics argue that the benefits come with hidden costs. The speed advantage of HFT firms allows them to detect and trade ahead of institutional orders, a practice known as "latency arbitrage." The complexity of the modern market structure, with more than 60 trading venues in the United States, creates opacity and fragmentation. The flash crash demonstrated that algorithmic traders can withdraw liquidity precisely when it is most needed, turning them from stabilizing market makers into destabilizing forces during stress.

IEX, founded in 2012 by Brad Katsuyama and featured in Michael Lewis's "Flash Boys," introduced a 350-microsecond speed bump designed to neutralize the latency advantage of HFT firms. IEX gained exchange status in 2016 and attracted a modest market share, but it did not fundamentally alter the market structure.

The Quant Arms Race

The algorithmic trading industry has become increasingly sophisticated. Simple speed-based strategies that were profitable in the early 2010s have been arbitraged away as more firms competed for the same microsecond advantages. Firms have moved toward more complex strategies that incorporate alternative data, machine learning, and natural language processing.

Renaissance Technologies, founded by mathematician Jim Simons, has been the most successful quantitative trading firm in history. Its Medallion Fund earned annualized returns of approximately 66% before fees from 1988 through the mid-2010s. The firm employs Ph.D. mathematicians, physicists, and computer scientists rather than traditional financial analysts. Its methods are proprietary and closely guarded, but they are understood to involve identifying subtle, non-obvious patterns in market data.

Two Sigma, D.E. Shaw, Citadel Securities, and Jane Street are among the other prominent firms operating in the quantitative and algorithmic trading space. These firms collectively invest billions of dollars annually in technology, data, and talent.

The talent pool has shifted accordingly. Where Wall Street once recruited primarily from business schools and economics departments, the leading trading firms now recruit from mathematics, physics, and computer science programs at MIT, Caltech, Stanford, and CMU. The skill set required to compete in modern markets has more in common with engineering than with traditional finance.

Regulation and the Future

Regulators have struggled to keep pace with the technological transformation of markets. The SEC has implemented circuit breakers, imposed registration requirements on certain proprietary trading firms, and conducted studies of market structure. The European Union's Markets in Financial Instruments Directive (MiFID II), implemented in 2018, imposed more stringent requirements on algorithmic traders, including testing requirements and kill switches.

More fundamental reforms have been proposed but not adopted. A financial transaction tax (FTT), which would impose a small tax on each trade, has been advocated as a way to reduce the volume of high-frequency trading. The European Commission proposed an FTT in 2011, but it has not been implemented across the EU. In the United States, the idea has been introduced in Congress multiple times without passage.

Batch auctions, which would replace continuous trading with periodic clearing at fixed intervals (perhaps every second or fraction of a second), have been proposed as a way to eliminate the speed race. Proponents argue that batch auctions would make speed irrelevant and reduce the resources wasted on latency competition. Critics counter that investors value the ability to trade immediately and that batch auctions would reduce liquidity.
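A minimal sketch shows how one such periodic clearing might work. The order format, the example prices, and the rule of pricing at the midpoint of the marginal matched pair are illustrative assumptions; real auction designs use more carefully specified uniform-price rules:

```python
def clear_batch(bids, asks):
    """Toy frequent-batch-auction clearing.

    Orders collected during one interval are (price, size) tuples. Sort
    bids descending and asks ascending, match quantity while the books
    cross, and take the clearing price from the marginal (last) matched
    pair. Illustrative sketch, not a real auction specification."""
    bids = sorted(([p, s] for p, s in bids), key=lambda o: -o[0])
    asks = sorted(([p, s] for p, s in asks), key=lambda o: o[0])
    matched, price, i, j = 0, None, 0, 0
    while i < len(bids) and j < len(asks) and bids[i][0] >= asks[j][0]:
        qty = min(bids[i][1], asks[j][1])
        price = (bids[i][0] + asks[j][0]) / 2  # midpoint of the crossing pair
        matched += qty
        bids[i][1] -= qty
        asks[j][1] -= qty
        if bids[i][1] == 0:
            i += 1
        if asks[j][1] == 0:
            j += 1
    return price, matched

bids = [(10.02, 100), (10.00, 200)]
asks = [(9.99, 150), (10.01, 100)]
print(clear_batch(bids, asks))  # all crossing interest trades at one price
```

Because every order that arrives within the interval trades at the same clearing price, being a microsecond faster than a competitor confers no advantage inside a batch.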

The Human Element

The transformation of trading from a human activity to an algorithmic one has had consequences beyond market structure. Entire career paths have been eliminated. The specialist firms that once managed the order books on the NYSE floor employed thousands of people. The pit traders at the Chicago exchanges who executed futures and options contracts numbered in the tens of thousands. Most of those jobs no longer exist.

The skills valued on Wall Street have shifted accordingly. Twenty years ago, a successful equity trader needed quick reflexes, strong interpersonal skills, and the ability to manage risk under pressure. Today, the most sought-after skills are in mathematics, computer science, and data engineering. The trading floor has been replaced by the server room, and the humans who remain in the trading process occupy different roles: designing algorithms, managing risk at the portfolio level, and monitoring systems for anomalies.

The trajectory of algorithmic trading points toward increasing automation, increasing complexity, and increasing speed. The days when human traders made the majority of trading decisions are over. The question for regulators and market participants is how to ensure that the algorithms that dominate trading serve the interests of the broader market, not just the firms that deploy them.

Written by Nazli Hangeldiyeva

Co-Founder of Grid Oasis. Political Science & International Relations, Istanbul Medipol University.
