Such changes have a lot of people worried, including the Securities and Exchange Commission. It released a wide-ranging paper earlier this year seeking suggestions on how to restructure the entire equity market, and created a Division of Risk, Strategy, and Financial Innovation in part to help monitor new technologies. A market collapse in early May—in which automated-trading systems exacerbated a sell-off that drove the Dow down more than 900 points in less than an hour, before it quickly recovered—gave two worries new public salience: that the proprietors of these algos may not be in full control of their creations, and that the strategies they pursue are, in some cases, fundamentally warping the financial markets.

In January, the NYSE fined Credit Suisse $150,000 for “failing to adequately supervise the development, deployment, and operation of a proprietary algorithm.” The fine was a pittance, but more troubling was that the bank didn’t even know its malfunctioning algo (which sent hundreds of thousands of cancel-and-replace requests for orders that hadn’t been made) had crippled some of the NYSE’s trading stations until regulators called the next day. This spring, a newsletter from the Federal Reserve Bank of Chicago warned: “Although algorithmic trading errors have occurred, we likely have not yet seen the full breadth, magnitude, and speed with which they can be generated. Furthermore, many such errors may be hidden from public view.”

Bernard Donefer, a finance professor at Baruch College and the author of a study in the most recent Journal of Trading called “Algos Gone Wild,” contends that the speed of these algorithms, and their ability to reach so many markets simultaneously, could turn even a minor coding error into a spiraling disaster. “Another 1987,” he told me, referring to the epic crash caused in part by simpler automated-trading schemes. This view puts Donefer in the minority in the financial community, which tends to have more faith in firms’ internal risk controls. But he thinks that without better regulation, more algo-gone-wild scenarios are inevitable. He notes that while controls at big firms, like Citi, are generally exemplary, second- and third-tier firms present a graver risk.

The SEC wants to hire a lot more staffers, both for its new risk division and for its trading division, and it is considering new methods of tracking algorithmic trades; Donefer and others have suggested a tagging system for the biggest traders, which the SEC says is on the table. The commission also may soon outlaw a practice called “naked access,” in which some broker-dealers offer their clients direct access to exchanges—allowing them to potentially bypass risk controls—in pursuit of faster trading.

A more widespread worry, now getting increased attention from regulators and Congress, is a strategy known as high-frequency trading. Practitioners of this technique apply algorithms and other automated technology, along with real-time market data, to buy and sell so quickly (in microseconds) and in such quantities (millions of trades a day) that they engorge themselves on penny differentials in prices. These traders argue that they supply the market with needed liquidity and tighter spreads. Regulators tend to agree, for the most part; free markets have always rewarded better information, speed, and creativity. But this technology buys and sells on such a massive scale, and so quickly, that regulators fear it could feed a dangerous and self-reinforcing volatility.
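The penny-differential mechanics described above can be sketched in a few lines of code. This is a deliberately simplified illustration, with hypothetical prices and made-up function names, not any firm's actual strategy: a market maker quotes one tick inside the current best bid and ask, and if both quotes fill, it pockets the narrowed spread.

```python
# Toy sketch of spread capture (illustrative only; real high-frequency
# systems involve exchange connectivity, queue position, and risk limits).

def quote(best_bid: float, best_ask: float, tick: float = 0.01):
    """Quote one tick inside the spread; None if there's no room."""
    if best_ask - best_bid <= 2 * tick:
        return None  # spread already too tight to improve both sides
    return (round(best_bid + tick, 2), round(best_ask - tick, 2))

def spread_capture(our_bid: float, our_ask: float, shares: int) -> float:
    """Profit if both quotes fill: buy at our_bid, sell at our_ask."""
    return (our_ask - our_bid) * shares

quotes = quote(10.00, 10.05)      # market is 10.00 bid / 10.05 ask
if quotes:
    bid, ask = quotes              # we quote 10.01 / 10.04
    profit = spread_capture(bid, ask, 1000)
    print(f"quote {bid}/{ask}, capture ${profit:.2f} on 1,000 shares")
```

On a three-cent capture, a thousand shares yields only $30; the strategy pays only when repeated millions of times a day, which is why speed, not insight, is the scarce resource here.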