We in healthcare lag other fields in computing technology and sophistication. The standard excuses are familiar: healthcare is just too complicated, doctors and staff won't accept new ways of doing things, everything is fine as it is, and so on. But we are shifting to a new high-tech paradigm in healthcare, with ubiquitous computing supplanting traditional care delivery models. Medicine has a 'deep moat' of regulatory and educational barriers to entry. However, the same was once said of the specialized skill sets of the financial industry. Wall Street has pared down its staffing, has automated many jobs, and continues to do so. More product (money) is being handled by fewer people than before: an increase in real productivity.

Computing on Wall Street in the 1960s-1970s meant large mainframe and minicomputer systems used for back-office operations. Most traders operated on 'seat of the pants' hunches and guesses, longer-term macroeconomic plays, or their privileged position as market-makers taking frequent small profits. One of the first traders to use computing was Ed Seykota, who applied Richard Donchian's trend-following techniques to the commodity markets. Seykota would run computer programs on an IBM 360 on weekends, and over six months tested four systems with variations (about 100 combinations), ultimately developing an exponential moving average trading system that would turn a $5,000 account into $15,000,000.(1) He would run his program, wait for the output, and then manually select the best system for his needs (usually the most profitable). He had access only to delayed, descriptive data, which required his own analysis to reach a decision.
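
As a sketch of the kind of system Seykota tested (not his actual rules, which the interview does not spell out), here is a minimal exponential-moving-average crossover signal. The EMA lengths and the toy price series are purely illustrative:

```python
# Minimal sketch of an EMA trend-following signal. The periods and prices
# are illustrative assumptions, not Seykota's actual system.

def ema(prices, period):
    """Exponentially weighted moving average of a price series."""
    alpha = 2 / (period + 1)          # standard EMA smoothing factor
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def crossover_signals(prices, fast=10, slow=30):
    """Long (+1) when the fast EMA is above the slow EMA, else flat (0)."""
    f, s = ema(prices, fast), ema(prices, slow)
    return [1 if fi > si else 0 for fi, si in zip(f, s)]

if __name__ == "__main__":
    # A toy steadily rising series: the signal goes and stays long.
    prices = [100 + i * 0.5 for i in range(60)]
    print(crossover_signals(prices)[-1])  # prints 1 on a sustained uptrend
```

The fast EMA reacts to recent prices more quickly than the slow one, so a sustained trend pulls the fast line above (or below) the slow line; that crossover is the whole trading rule.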

In the 1980s-1990s computing power increased with the PC, and text-only displays evolved into graphical displays. Systems traders became some of the most profitable traders in large firms. Future decisions were being made on historical data (early predictive analytics). On balance, well-designed systems traded by experienced traders were successful more often than not. Testing was faster, but still not fast: a single-security run on a 386-based IBM PC took about 8 hours. As more traders began to use the same systems, the systems worked less well. This was an 'observer effect': traders rushing to exploit a particular advantage quickly caused that advantage to disappear! The system trader's 'edge', or profitability, was constantly declining, so new markets and circumstances were sought. 'Program' trades were even accused of causing the 1987 stock market crash.

There were some notable failures in market analysis; the fast Fourier transform (FFT) is one. With enough computing power you could fit an FFT to the market perfectly, but the fit would hardly ever work going forward. The FFT fails because it presumes a cyclical formula, and the markets, while cyclical, are not predictably so. The interesting phenomenon was that the better the FFT fit, the quicker and more completely it would fall apart. That is curve-fitting. 'Fractals' were all the rage later and failed just as miserably, for the same reason. As an aside, this explains why simpler linear models in regression analysis are frequently 'better' than a high-degree polynomial spline fit to the data, particularly for predictive analytics. The closer you fit the historical data, the less robust the model becomes and the more prone it is to real-world failure.
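
The curve-fitting trap is easy to demonstrate. The sketch below uses synthetic data, not market data: a linear trend plus deterministic +/-1 "noise", fitted two ways, with a plain least-squares line and with a polynomial that passes through every historical point exactly. The perfect fit wins in-sample and loses catastrophically a few steps out of sample:

```python
# Curve-fitting illustrated: a model that fits history perfectly can fail
# badly one step outside it. Data is synthetic (linear trend + alternating
# +/-1 noise) so the example is fully reproducible.

xs = list(range(10))                                    # training days 0..9
ys = [0.5 * x + (1 if x % 2 == 0 else -1) for x in xs]  # noisy "prices"

def linear_fit(xs, ys):
    """Ordinary least-squares line a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def interpolate(xs, ys, x):
    """Degree-9 Lagrange polynomial: zero error on every training point."""
    total = 0.0
    for xi, yi in zip(xs, ys):
        term = yi
        for xj in xs:
            if xj != xi:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

a, b = linear_fit(xs, ys)
x_new, true_y = 12, 0.5 * 12            # just past the training window
lin_err = abs((a + b * x_new) - true_y)
poly_err = abs(interpolate(xs, ys, x_new) - true_y)
print(f"simple linear model, error at day 12:    {lin_err:.2f}")
print(f"perfect-fit polynomial, error at day 12: {poly_err:.2f}")
```

The line misses by under half a unit three days out; the polynomial, which had zero error on every training day, misses by tens of thousands. More parameters buy in-sample accuracy at the cost of robustness.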

Further advances in computing and computational statistics followed in the 1990s-2000s. Accurate real-time market data became widely available and institutionally ubiquitous, and time frames became shorter and shorter. Programs running on daily data were switched to multi-hour, hourly, and then minute intervals. The trend-following programs of the past became failures as the markets grew choppier, and anti-trend (mean reversion) systems became popular. Enter the quants: the statisticians.(2) With fast, cheap, near-ubiquitous computing, the scope of the systems expanded. Many securities could now be analyzed at once and imbalances exploited; hence the popularity of 'pairs' trading. Real-time calculation of indices created index arbitrage programs, which could execute without human intervention.
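
A minimal sketch of the pairs-trading (mean reversion) idea, with made-up prices and an arbitrary entry threshold: track the spread between two historically related securities and bet on reversion when it strays too far from its average.

```python
# Pairs-trading sketch: trade when the spread between two related securities
# deviates from its historical mean. Prices and the 2-sigma threshold are
# illustrative assumptions, not a real strategy.
from statistics import mean, stdev

def pairs_signal(prices_a, prices_b, entry_z=2.0):
    """Return a trade direction based on the z-score of today's spread."""
    spread = [pa - pb for pa, pb in zip(prices_a, prices_b)]
    mu, sigma = mean(spread[:-1]), stdev(spread[:-1])  # history excludes today
    z = (spread[-1] - mu) / sigma
    if z > entry_z:
        return "short A / long B"   # spread unusually wide: bet it narrows
    if z < -entry_z:
        return "long A / short B"   # spread unusually narrow: bet it widens
    return "no trade"

# Two securities that normally trade about 1.0 apart; today A spikes.
a = [10.0, 10.1, 9.9, 10.0, 10.1, 9.9, 10.0, 12.0]
b = [9.0, 9.1, 8.9, 9.0, 9.0, 8.9, 9.0, 9.1]
print(pairs_signal(a, b))   # prints "short A / long B"
```

The bet is market-neutral: it wins if the spread reverts, whatever the overall market does, which is why the approach appealed as trend-following faded.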

The index arbitrage (index-arb) programs relied on speed and proximity to the exchanges for an execution advantage. Statistical arbitrage (stat-arb) programs were the next development. These evolved into today's high-frequency trading (HFT) programs, which now dominate systems trading. These programs are tested extensively on existing data and then let loose on the markets with only high-level oversight. They make thousands of trading decisions a second, incur real profits and losses, and compete against other HFT algorithms in a Darwinian environment where the winners make money and are adapted further, and the losers are dismissed with a digital death. Master governing algorithms coordinate the individual algorithms.(4)

The floor traders, specialists, market-makers, and scores of support staff that once participated in the daily business have been replaced by glowing boxes sitting in a server rack next to the exchange.

This is not to say that automated trading algorithms are perfect. A rogue algorithm with insufficient oversight caused the forced sale of Knight Capital Group (KCG) in 2012.(3) The lesson here is significant: once automated algorithms are in greater use, there ARE going to be errors. It is inevitable.

So, reviewing the history, what happened on Wall Street?

1. First was descriptive analytics based upon historical data.

2. Graphical interfaces were improved.

3. Improving technology led to more complicated algorithms, which overfit the data. (WE ARE HERE)

4. Improving data accuracy led to real-time analytics.

5. Real-time analytics led to shorter analysis timeframes.

6. Shorter analysis timeframes led to dedicated trading algorithms operating with only high-level human supervision.

7. Master algorithms were created to coordinate the efforts of individual trading algorithms.

In the next post, I'll show the corollaries in healthcare and use them to predict where we are going.

(1) Jack Schwager, Market Wizards, Ed Seykota interview, pp. 151-174.

(2) David Aronson, Evidence-Based Technical Analysis, Wiley, 2007.

(3) Wall Street Journal, "Trading Error Cost Firm $440 Million," MarketBeat.

(4) Personal communication, HFT trader (name withheld).


