Market forces drive the specialisation of labour, and the specialisation of labour increases productivity. This insight, recognised by many philosophers at different times in history, can be observed most clearly in the most advanced market economies. There is no doubt that specialisation increases productivity, but at what cost? It is in answering this question that Adam Smith provides one of the staunchest critiques of free market economics. In a contentious passage in Book V of The Wealth of Nations, Smith writes:

The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same, has no occasion to exert his understanding, or to exercise his invention in finding out expedients for removing difficulties which never occur. He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become. The torpor of his mind renders him, not only incapable of relishing or bearing a part in any rational conversation, but of conceiving any generous, noble, or tender sentiment, and consequently of forming any just judgement concerning many even of the ordinary duties of private life.

I find it difficult to disagree with Smith, especially given the conditions in which labour operated in his time, conditions that, some may argue with justification, still exist today in many parts of the world. But how relevant is this critique to knowledge workers in the digital information age? Can stupidity be productive in a knowledge economy?

To be clear, I am an advocate of free market economics, but I am not a free market fundamentalist. I value specialisation and automation; in fact, one area of specialisation in my professional career has been IT automation, a specialisation motivated by a drive to eliminate the need to "perform a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same". One could argue that IT automation helps avoid creating people who may "become as stupid and ignorant as it is possible for a human creature to become" by eliminating specialisation in repetitive cognitive processes that are better executed by a computer.

My first paid job, a paper round, came at around the age of 10, and I had been writing computer games since the age of 8 (thank you, Sir Clive Sinclair, and the editors, authors and contributors to ZX Spectrum magazines), so I have been playing around with IT for some time now. Many of my peers built websites from scratch during the dotcom boom. By "from scratch" I mean that we bought and assembled our IT infrastructure, all of it: storage, networks, compute. We then proceeded to install or write the software to operate the services we were building. In those days the minimum expectation for someone who "specialised" in IT was a broad range of skills and a high degree of aptitude to learn whatever you needed to learn to get the job done.

If raising the human capacity for action is the expectation for everything digital, and I argue that it is the ultimate reason for going digital in the first place, then those among us who can take full advantage of the technology should have the capacity to grow into polymaths. As we increase our capacity to act, we increase our capacity to learn, and perhaps reduce Malcolm Gladwell's 10,000 hours of practice required to become an expert.

Unfortunately, the fetishisation of specialisation could be the biggest obstacle to this flowering of human potential. Managers recruiting 'talent' to deliver short-term objectives mobilise recruiting agencies to filter candidates for keywords that match the perceived desired specialisation. How successful is this approach to talent acquisition? I honestly don't know, but I'd say Smith's critique of specialisation probably applies to the managers and recruiters who engage in such unthinking, repetitive work, which according to Smith renders them "as stupid and ignorant as it is possible for a human creature to become".

In my own experience, the best hires I've made have had literally no IT experience at all. Oddly enough, most of them have been musicians who decided to change careers. They learned on the job and then proceeded to outcompete their colleagues. The learning wasn't easy: it demanded 10 to 14 hours a day for about six months to be in a position to take on a well-paid IT role. For me the benefits were huge: a loyal, capable team that I could trust implicitly to get the job done and deliver value. Contrast this with the hires I've made through agencies attempting to match specialised skills with requirements, where I'd say only 6 out of 10 hires have met my expectations.

I find it odd that we set such low expectations for knowledge workers; the prevailing expectation amongst recruiters and those making hiring decisions seems to be that someone can only be a Big Data specialist, a Digital specialist, a Mobile development specialist, a Network specialist, and so on.

There are some more positive signs: for example, the term "Full Stack Developer" and the rise of "Polyglot Programming" recognise the potential of the polymathic knowledge worker.

I leave the reader with a video of Ian Gilbert, one of the UK's leading educational innovators.