Algorithms from wartime to Wall Street

article and photo by Geoff Olson

The Hungarian-born émigré and mathematician John von Neumann is remembered as an urbane and witty man. His colleagues admired his finely tailored clothes and superhuman capacity to handle liquor, to say nothing of his important contributions to a wide range of fields, from game theory to quantum physics. The man’s bald dome housed one of the most powerful brains of the twentieth century.

Von Neumann’s career arc took him from the top-secret Manhattan Project to the Institute for Advanced Study in Princeton, where Albert Einstein also worked. The hawkish Hungarian did not share the frizzy-haired German’s fears about nuclear weapons. After the horrors of Hiroshima and Nagasaki, von Neumann approached the US government with a proposal for a new computer that would overtake its predecessor ENIAC (Electronic Numerical Integrator and Computer) – a machine that had to be laboriously rewired by hand for each new problem – in running the mathematical simulations essential to atomic testing.

Einstein petitioned against building the machine at Princeton, but the US government approved the proposal. In 1951, von Neumann’s team unveiled the Mathematical Analyzer, Numerical Integrator and Computer, known by the acronym MANIAC – meaning something crazy and uncontrollable.

“Because city-destroying bombs couldn’t be built by trial and error, computers were required to simulate the physics of detonation and blast waves. A computer helped build the bomb and the bomb necessitated ever more advanced computers,” observes author William Poundstone in the New York Times. MANIAC was the bulky, slow-moving granddaddy of today’s supercomputers, desktop systems, notebooks and smart phones. Most of our computing gadgets owe their attention-fracturing existence to John von Neumann and the Cold War’s nuclear stalemate.

A Punch cartoon from 1959 portrays two scientists in lab coats standing next to a huge mainframe computer that has been programmed to answer the question, “Is there a God?” They are holding a printout of the computer’s response: “There is now.” A mere half-century after that unnerving punch line, computers have come to dominate our lives in uncanny ways. Who would have thought that the humdrum telephone – a Canadian inventor’s century-old brainchild – would break free from the home to become a beeping, buzzing vector for social connection/disconnection? Or that the architecture of the processors in smart phones is traceable back to von Neumann’s MANIAC, a device designed to crank out estimates for nuclear blast range and fallout?

Yet even while our phones and computers went mobile, other changes were happening on the digital front, far below the public radar. These changes make for a story with mythic dimensions, which we are only partway through. Just as the Biblical creation myth involved a Tree of Knowledge, there’s also a tree of knowledge in this unfinished tale and, as an added bonus, it has a colourful selection of serpents.

Over two hundred years ago, businessmen in breeches and buckled shoes regularly gathered at a buttonwood tree at the foot of Wall Street to wheel and deal. On May 17, 1792, under the tree’s dappled shade, 24 stockbrokers signed “The Buttonwood Agreement,” initiating the New York Stock & Exchange Board at 68 Wall Street.

Proximity was key. Firms set up broker offices near the exchange so their employees could run over as fast as possible to buy and sell. Today, subatomic particles perform the legwork on Wall Street. Consider this: it takes you approximately 350,000 microseconds to blink and 500,000 microseconds to click a mouse. Computers can now perform trades in the span of just a few microseconds. Needless to say, even a bipolar, coked-up day trader can’t touch the cheapest Chinese-made netbook in response time.
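To see just how lopsided that contest is, here is a back-of-the-envelope sketch. The blink and click figures are the ones quoted above; the 10-microsecond trade time is purely an illustrative assumption:

```python
# Back-of-the-envelope comparison of human and machine reaction times.
# BLINK_US and MOUSE_CLICK_US are the figures quoted in the text;
# TRADE_US is an illustrative assumption, not a measured number.
BLINK_US = 350_000        # one blink of an eye, in microseconds
MOUSE_CLICK_US = 500_000  # one mouse click, in microseconds
TRADE_US = 10             # a plausible machine trade time, in microseconds

print(f"One blink spans {BLINK_US // TRADE_US:,} trades")             # 35,000
print(f"One mouse click spans {MOUSE_CLICK_US // TRADE_US:,} trades")  # 50,000
```

In the time it takes a human finger to press a button, a machine working at that pace could trade fifty thousand times. So what has this meant for Wall Street, where timing is critical in the buy-and-sell process?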

It’s meant that Wall Street has given decision-making over to the machines. Every day, signals race across electronic networks at near the speed of light to perform financial transactions. It’s been called “black box trading” or “high-frequency trading” (HFT). The decisions to buy and sell are made by computer programs. These automatic computing procedures are called algorithms: coded sets of rules that define a precise sequence of operations.
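To make that definition concrete, here is a deliberately toy sketch in Python: one coded rule applied to a stream of stock quotes. The rule, the symbols and the numbers are all hypothetical – real trading systems are incomparably faster and more sophisticated:

```python
# A toy illustration of an algorithm: a coded rule applied to incoming quotes.
# The buy rule and the quote data below are hypothetical.

def should_buy(price: float, moving_average: float) -> bool:
    """Pre-programmed criterion: buy when the price dips 1% below its average."""
    return price < 0.99 * moving_average

# Scan every incoming quote against the rule, over and over.
quotes = [("IBM", 178.00, 180.00), ("AAPL", 96.10, 95.50)]  # (symbol, price, average)
for symbol, price, average in quotes:
    if should_buy(price, average):
        print(f"BUY {symbol} at {price:.2f}")
```

A real high-frequency algorithm runs a loop like this not over two quotes but over millions, reacting in microseconds.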

These procedures are often massively repetitive. In consumer-grade software, algorithms repeat their routines over and over until, for example, they have compressed a high-res photographic image into a web-friendly jpeg, or shoehorned a CD track into a tinny-sounding mp3. In high-frequency trading, algorithms continually sniff out stocks according to pre-programmed criteria. Amazingly, an estimated 70 percent of US equity trades in 2010 were executed via HFT.

That’s right. Most of the market activity on Wall Street is performed by machines, not human beings.

Here’s how it works: in the morning, a firm decides on a trading strategy for the day. The firm then releases its computational hounds into the secure, electronic backbone of the exchange. Millions of shares may be bought and sold throughout the day, but a young manager manning one of the HFT ‘special desks’ may know little to nothing about the value of the companies involved. He or she is only there to watch the screens and pull the plug if market activity gets out of hand. At the ring of the market bell, the firms close their positions and rake in the bucks.

The HFT traders aren’t after a few big fish; they’re targeting an ocean’s worth of minnows. The gains and losses are tiny per trade, just a fraction of a penny per share or currency unit. Yet with algorithms darting in and out of short-term positions millions of times a day, and the firms liquidating their entire portfolios daily, the takings add up. Goldman Sachs netted $300 million in 2009 and the hedge fund Citadel made $1 billion in 2008 from high-speed strategies, notes Sarah Anderson of the Institute for Policy Studies.

According to a study by TABB Group, 48 percent of HFT volume comes from a few hundred proprietary trading houses, 46 percent from investment banks and six percent from a dozen or more hedge funds.

Paper currency originated as a virtual representation of precious metals, and electronic currency acts as a virtual representation of paper currency. HFT, which gambles on electronic currency free of human oversight, is virtualization run amok. It’s not so much trading as Tron. Critics say it has created “dark pools” and a kind of “Shadow Wall Street” where transparency and fairness are trumped by cabbalistic code.

HFT first hit the news after May 6, 2010, when nine percent of the Dow Jones Industrial Average momentarily vanished, almost plunging the world into chaos. The 1,000-point drop was later traced to a mutual fund that dumped $4.1 billion of securities in a 20-minute period, which were gobbled up and sold within microseconds by algorithms. The incident became known as the “Flash Crash,” after the US Securities and Exchange Commission suggested some blame lay with a variant of high-speed trading called flash trading. (This is when selected players are allowed to see incoming orders to buy or sell securities a split second earlier than the general market participants, in exchange for a fee.)

Could such untracked activity leverage a market mood swing into another financial meltdown? A 2011 survey of global financial firms found that 67 percent of executives believe “rogue algorithms” are inescapable; among the US financial executives surveyed, the figure was 78 percent.

It’s not like Wall Street code hasn’t got us into trouble before. The granddaddy of financial algorithms, the so-called Black-Scholes equation, won its creators the 1997 Nobel Prize in Economics. Unfortunately, the equation makes no allowance for “Black Swan” events like market crashes. Ian Stewart, a respected science writer and professor of mathematics at the University of Warwick, insists the Black-Scholes equation is dumb as dirt in dealing with real-world market gyrations, and may even amplify them.

“Any mathematical model of reality relies on simplifications and assumptions. The Black-Scholes equation was based on arbitrage pricing theory, in which both drift and volatility are constant. This assumption is common in financial theory, but it is often false for real markets,” Stewart observed in the Guardian. Financial managers who used ever more complicated derivatives – bets on bets on bets – eventually became prisoners of their instruments, like Mickey Mouse and his out-of-control brooms in the Disney film Fantasia.
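For readers who want a look at the culprit, the equation is usually written as a partial differential equation for the price V of an option on an underlying asset S, where σ is the asset’s volatility and r the risk-free interest rate. It is precisely the assumption that σ and r stay constant that Stewart flags as false for real markets:

```latex
\frac{\partial V}{\partial t}
  + \frac{1}{2}\,\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S}
  - r V = 0
```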

Herd-driven market crashes are “virtually impossible under the model’s assumptions,” notes Stewart. The professor doesn’t blame the great subprime scam and credit crisis of ‘08 on bad math alone. “Black-Scholes may have contributed to the crash, but only because it was abused. In any case, the equation was just one ingredient in a rich stew of financial irresponsibility, political ineptitude, perverse incentives and lax regulation.”

The mathematician sees only more problems in hyper-speed finance. “The facility to transfer billions at the click of a mouse may allow ever-quicker profits, but it also makes shocks propagate faster,” he observes.

Critics accuse some HFT traders of ‘front running’ and other illegal activities. Andy Brooks, head of US stock trading at the mutual fund company T. Rowe Price, hinted at the dimensions of the problem. “We know that some high-frequency trading strategies have cancellation rates in the 95 percent range,” he told the Baltimore Sun this year. “So that means that 95 percent of the time that you say you want to buy 100 shares of IBM, you don’t really buy it. And that begs the question: Why have you said you want to buy? Are you trying to influence someone to do something else? And is that manipulative?”

In 2010, the Financial Services Authority in Britain fined one firm and froze the assets of another for HFT abuses on the London Stock Exchange. The US Securities and Exchange Commission has proposed monitoring HFT in real time with a consolidated audit trail, but for its part, the Obama Administration has not signalled much interest in doing anything to alienate the incumbent’s few remaining campaign benefactors on Wall Street. The European Commission’s proposal for a financial transaction tax on speculators, which might reduce HFT volume on world exchanges, has gone nowhere.

A 2010 study of the NYSE flash crash commissioned by Britain’s Government Office for Science contends the world narrowly avoided a “true nightmare scenario,” with the market contagion spreading to the global exchanges. The report concludes, “On the afternoon of May 6, 2010, the world’s financial system dodged a bullet.” (Amid concern that HFT is playing chicken with a “Black Swan” event, exchanges have reportedly installed post-flash-crash “circuit breakers” to halt trading in the event of extreme volatility.)

In a must-see 2011 talk on algorithms delivered at TED (Technology, Entertainment, Design), entrepreneur and new media maven Kevin Slavin insists the flash crash of 2010 indicates we have written code “we can no longer read. And we’ve rendered something illegible. And we’ve lost the sense of what’s actually happening in this world that we’ve made.”

This goes far beyond Wall Street. As an example, Slavin cites an anecdote from e-commerce in which a book for sale at Amazon.com, The Making of a Fly: The Genetics of Animal Design, rose from $1.7 million to $23.6 million in the space of a few hours. “When you see this kind of behaviour, what you see is the evidence of algorithms in conflict, algorithms locked in loops with each other, without any human oversight, without any adult supervision to say, ‘actually, $1.7 million is plenty,’” Slavin dryly observes.

UK software engineers now offer story algorithms to Hollywood. A company called Epagogix can run a script through its proprietary code to quantify whether it portends a $30 million movie or a $200 million movie. This is no longer about finance; it’s about “the physics of culture,” notes Slavin. “And if these algorithms, like the algorithms on Wall Street, just crashed one day and went awry, how would we know, what would it look like?”

We already know that high-speed trading is affecting the hard-edged world of people and property. In a weird replay of the buttonwood tree era, trading distance is more crucial than ever. A distance of 20 miles between a high-speed trader and the exchange adds more than a hundred microseconds of signal travel time each way – a critical handicap for algorithms. Financial firms are locating their offices as close as possible to the NYSE hub to shave off these microseconds and the NYSE itself has installed supercomputers in its basement for high-paying clients.
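The arithmetic behind the real estate scramble is straightforward. Here is a rough sketch, assuming light in optical fibre travels at about two-thirds of its vacuum speed (real links vary):

```python
# Rough conversion of distance to one-way signal latency in optical fibre.
# Assumes light in fibre moves at about 2/3 of c (~200,000 km/s); real links vary.
FIBRE_SPEED_KM_S = 200_000

def one_way_latency_us(distance_km: float) -> float:
    """One-way signal travel time, in microseconds, over a fibre of this length."""
    return distance_km / FIBRE_SPEED_KM_S * 1_000_000

# 20 miles is roughly 32.2 km:
print(f"{one_way_latency_us(32.2):.0f} microseconds each way")  # ~161
```

A firm 20 miles out starts every race roughly 160 microseconds behind a rival on the exchange’s doorstep – at machine speeds, enough of a head start for the rival to win every trade.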

In a world of whizzing financial code, what is the “market value” of slow, squishy human beings, with their circadian rhythms that evolved in tandem with a slowly turning planet? Not much. In his TED talk, Slavin recalls meeting “an architect in Frankfurt who was hollowing out a skyscraper – throwing out all the furniture, all the infrastructure for human use, and just running steel on the floors to get ready for the stacks of servers to go in” – just so algorithms could be closer to the financial electronic networks. For the same area of leased space, a human being squeezes out far less profit than a supercomputer firing off digits at the stock market.

Over the past few years, a company called Spread Networks has built an 825-mile trench between New York City and Chicago. This massive project is for a cable that carries trading signals between the two cities 37 times faster than you can click a mouse. One of the newer projects in this field is a $300 million fibre-optic line beneath the Atlantic Ocean, intended to shave a few milliseconds off the data transmission time between London and New York markets. Algorithms have “a kind of manifest destiny” that will always seek out a new frontier, Slavin observes. Along with nature and man, there is now a “third co-evolutionary force,” he says: the algorithm.
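That “37 times” figure survives the same back-of-the-envelope test, assuming a round trip over the roughly 1,330-kilometre route at the same fibre speed:

```python
# Sanity check on the "37 times faster than a mouse click" claim.
# Assumes the ~825-mile (~1,330 km) route and fibre at ~200,000 km/s.
ROUTE_KM = 1_330
FIBRE_SPEED_KM_S = 200_000
MOUSE_CLICK_US = 500_000  # one mouse click, in microseconds

round_trip_us = 2 * ROUTE_KM / FIBRE_SPEED_KM_S * 1_000_000  # ~13,300 us
print(f"Round trip: {round_trip_us:,.0f} microseconds")
print(f"About {MOUSE_CLICK_US // round_trip_us:.0f} times faster than one click")
```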

Many of us feel life is going faster and faster these days. The intuition isn’t without a real-world basis. Financial market time operating in microsecond cycles is driving our stock-obsessed 24-hour news channels and Internet blogs, which in turn demand a manic, megahertz pace from political campaigns and pop culture. And God help the hapless consumer if he or she is not fully wired and connected at all times to “the Cloud.”

The coming-out party for John von Neumann’s MANIAC arrived in the summer of 1951, “with a thermonuclear calculation that ran for 60 days nonstop,” writes George Dyson in the 2012 book Turing’s Cathedral. The author quotes von Neumann’s second wife, Klari, who recalled his anxiety over what his invention might mean for the world. One evening in 1945, the mathematician proclaimed, “What we are creating now is a monster whose influence is going to change history, provided there is any history left.” Von Neumann wasn’t so much concerned about the atomic bomb per se as “the growing powers of machines,” Dyson observes.

“Is there a God?” scientists asked a huge mainframe computer in that old Punch cartoon. No real-world device could answer such a question in the era of von Neumann, who passed away in 1957. But perhaps the algorithms racing through the world’s electronic networks will one day give our machines a voice. In a sense, they already have, through Apple’s smart phone app, Siri. Not surprisingly, the interactive voice recognition software originated as a project funded by DARPA (the Defense Advanced Research Projects Agency), a Pentagon nursery for classified high technology.

I’m afraid to ask Siri if there’s a God.

www.olscribbler.wordpress.com
