What everyone in the world wants is a good job. It's not love, shelter, security, money, happiness, or freedom, although all of these things may come from holding a job one can appreciate. That's the conclusion of Jim Clifton's new book, The Coming Jobs War. But what happens when the good jobs disappear? That's what the United States and the world are now discovering.

The cause of this shift -- the advance of technology -- is the same as that behind earlier transformations. Yet few have devoted serious time or effort to discussing what's happening and where it's headed. This apathy is dangerous, because the stakes are so high and the alternatives so hard to see. A serious discussion of the modern employment picture can't be held without also discussing the technologies that make it possible, and where it's all leading. It's my hope to start that discussion today.

Jobs in the headlines

Jobs reports have taken on particular significance in this election year, with each micro-statistic torn apart for shreds of meaning. By most measures, the picture is improving: jobs are being created and unemployment is dropping. But many of these jobs are less secure than those lost in the crash, and many are low-paying or part-time. February's jobs report showed a gain of 227,000 non-farm jobs, but almost 40% of them were in temporary office work or in restaurants and bars. Since jobs began returning in 2010, fully a fifth of all jobs created have been in food services.

Even Apple (Nasdaq: AAPL) has jumped into the jobs debate, releasing a report this month on the jobs it's created or supported in the United States. The tally comes to 514,000, a surprisingly large number considering that Apple directly employs only 47,000 people. Some people will call it an understatement; MIT economics professor David Autor calls the entire effort "disreputable." Is Apple stretching the truth? Using Apple's specious methodology -- which tallies up UPS drivers that deliver iStuff, component manufacturers, and app developers -- Microsoft (Nasdaq: MSFT) could take credit for pretty much every job in the country that so much as involves looking at a Windows computer.

Of course, the report makes no mention of the jobs eliminated by Apple's technological advances.

A lost decade

Until the last few months of job growth kicked in, the United States had gone a full decade with no net private-sector job growth at all -- despite adding 30 million people to its population -- the longest such stretch on record. That's not the only measure lagging past levels, and many of them point toward the decline of the American worker. One statistic, however, has done fabulously: productivity.

Source: U.S. Bureau of Labor Statistics.

Those still on the job are working slightly fewer hours than they did a decade ago, yet they manage to produce more than twice as much economic benefit per person. That benefit, by and large, hasn't flowed back to the ranks of the employed: the average household actually earns less annually in real terms than it did ten years ago, and real median incomes were almost exactly the same in 2010 as in 2000. Real GDP, however, has risen by $2 trillion. Much of that growth rests on two pillars of the modern economy: corporate profits and consumer credit.

Sources: U.S. Internal Revenue Service and U.S. Bureau of Labor Statistics.

If productivity gains are driving profits, technological progress, as always, is enabling greater productivity. Behind that progress is the specter of permanent labor force displacement, which comes with the convergence of several aspects of technology and the connected global economy. I've made this point before, and it's no less true today despite recent employment gains.

Manufacturing is for machines

There used to be (and there frequently still is) a dismissive term for people who make these claims: Luddite. Isn't the real problem globalization, after all? Apple outsources the assembly of its products to Foxconn, a titanic mass of poorly paid, semi-skilled, and incredibly well-regimented Chinese workers that manufactures nearly half the entire world's consumer electronics. Foxconn employs more than a million workers. We could sure use a million new jobs in the United States.

If we brought every single Foxconn job to the United States, it would still fail to restore manufacturing employment to its 2008 totals. That's just a slice of a much larger pie that's disappeared in recent years. Since its 1979 peak, the manufacturing sector has lost 7.7 million jobs, and much of that decline is recent: since the turn of the century alone, the sector has trimmed 5.5 million workers from its payrolls.

Foxconn doesn't even want more employees. It plans to add a million robots to its factory floors by 2014, a move that makes perfect business sense when profits are paramount. Robots never eat, never sleep, never ask for raises, never get injured or suffer nervous breakdowns. Robots never jump off buildings and cause public relations headaches. Eventually, all manufacturing will be robotic from beginning to end. Just as cars replaced horses, well-designed machines can and will replace human beings on the assembly line.

Convergence toward singularity

Manufacturing employees have historically been cannon fodder for technological progress. However, the new battleground isn't on factory floors and forges. The war for work is now taking place in cubicles, malls, and offices. Robots have always been great at outmuscling and outlasting human bodies, and now a host of advancements are coming after human minds.

A fully human simulation is hardly necessary to automate most jobs, which are often routinized and reliant on specific, narrow skill sets. For every job that relies on specific knowledge applied in a routinized way, there is an example of automation that has already replaced, or soon could replace, the person performing it. The positions already under threat run the gamut from cab drivers and waiters to lawyers and doctors, with plenty of mid-level white-collar wage slaves at risk as well. The few jobs resistant to digital encroachment are those requiring high-level abstract thought, and those positions will never be feasible or necessary for the majority of the population, for two reasons that go beyond the accelerating pace of technological change.

The broken bridges

First, the American education system is wholly inadequate to produce the broadly skilled, technologically fluent workforce those jobs demand. American universities still command top global prestige, but only 28% of Americans hold college degrees. The path to a degree is costly (the U.S. places 12th out of 15 developed countries in final post-subsidy costs), and thus not as well-traveled as that prestige might suggest (9th out of 14 countries in peak-age participation rates).

To make things worse, American primary and secondary education does an abysmal job of preparing students for college. A 2009 OECD study ranked the United States 25th of 34 countries in math achievement, 14th in reading, and 17th in science. A more recent World Economic Forum study of 142 countries ranked American early education 37th and its high school math and science education 50th. Is it any wonder, then, that a quarter of American college freshmen require remedial coursework, and at least half are unprepared for the increased rigor?

I could offer damning statistics on the inadequacy of American education for days, but in the long run even a concerted push for a better system may not matter, because of the second reason widespread gains from the high-level digital economy will stay out of reach: a connected world channels public attention and financial rewards toward an ever-smaller group of elite individuals and businesses, despite the greater variety that connectivity and accessibility encourage.

For example, the top five films of 1980 earned a quarter of all box office receipts, while three decades later the top five earned only 16%. That comparison isn't really fair, though, because almost five times as many films were released in 2010 as in 1980. Compare the top 5% of films released in each year instead: 2010's top grossers pulled in fully half of all receipts, while 1980's biggest hits earned only 27% of the total take.
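The shift from a fixed top-five count to a top-5% share is easy to compute. Here is a minimal sketch of that concentration measure; the revenue figures are invented for illustration, and only the method mirrors the comparison above:

```python
def top_share(revenues, fraction=0.05):
    """Fraction of total revenue earned by the top `fraction` of titles."""
    ranked = sorted(revenues, reverse=True)
    k = max(1, int(len(ranked) * fraction))  # always count at least one title
    return sum(ranked[:k]) / sum(ranked)

# Invented grosses (in millions) for a hypothetical 20-film slate:
slate = [400, 350, 90, 60, 40, 30, 20, 15, 10, 8,
         6, 5, 4, 3, 2, 1, 1, 1, 1, 1]
print(f"Top 5% share: {top_share(slate):.0%}")
```

With five times as many releases, a fixed top-five count covers a much smaller slice of the slate, which is why holding the percentile constant, as above, is the fairer comparison.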

The technology industry is no different. Activision Blizzard's (Nasdaq: ATVI) Modern Warfare 3 accounted for 8% of all Xbox games sold last year, out of more than 800 titles available. The Angry Birds franchise, though it represents fewer than a dozen of the more than 800,000 apps on the major mobile platforms, is responsible for at least 1.5% of all downloads ever recorded in the combined history of the App Store and the Android Market.

Apple earned $700,000 in profit for each of its 47,000 employees last year, and much of that was the work of a far smaller group of engineers, programmers, and designers in Cupertino. By comparison, IBM (NYSE: IBM), the world's largest technology company three decades ago, earned only $25,800 per employee at the time, in real terms. The examples go on.

The pieces come together

We live in undeniably better times than those Luddites who first smashed the looms, but we also operate in an economic system that rewards production and marginalizes those unable to find work. Combine that reality with exponentially improving technology, and you have all you need to put capital on top of labor for good. When protests erupt over income disparity and studies reveal that the top 1% accounted for 93% of all income growth in 2010, the root cause isn't that the rich are greedy or that the poor don't work hard enough. In many cases, it's simply that those with capital can and will make use of labor-saving technology, and those whose labor is "saved" lose out.

For years investors have cheered streamlining and cost-cutting measures that trimmed payrolls and eliminated redundant jobs. But workers -- consumers, really -- are the economy. Henry Ford offered incomparable wages in Ford's (NYSE: F) early days, which both attracted the top talent to out-innovate his competitors and allowed more people to afford his cars. Many years later, his son allegedly had the following exchange with a union boss while touring a newly automated plant:

"Walter, how are you going to get these robots to pay your union dues?"

"Henry, how are you going to get them to buy your cars?"

Displacing the worker-consumer from any job, whether it's in a factory or at a desk, reduces their ability to contribute to the economy. Sitting back and glibly saying that they can become computer programmers and app designers and secondhand salesmen on eBay or Amazon.com ignores the trend that draws reward and prestige toward a smaller group of top talents, and the incredible pace of advancement that allows this shrinking group to control ever-larger portions of the economy through automation.

The computer industry, as measured by the Bureau of Labor Statistics, accounts for between 2% and 3% of all employment in the United States. Let's use Apple's methodology and expand the jobs that industry supports tenfold, to a full 30% of the workforce. That's a hair under 40 million people, about the same as the number of iPads Apple sold last year. How will the rest of the country support itself? Where will the new frontier of jobs appear, and how can it possibly absorb the displaced? In the 18th century, factories took on displaced farmers. In the 1900s, we had offices. This is the third great displacement, and it needs a solution of equal scope. Where is it going to come from?
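The closing arithmetic can be checked in a few lines. A quick sketch; the 131 million total-employment figure is my assumption (roughly U.S. payroll employment at the time), not a number from the text:

```python
total_employment = 131_000_000  # assumed: approx. U.S. payroll employment
computer_share = 0.03           # upper end of the 2%-3% BLS range cited above

# Apply the tenfold expansion that Apple's methodology implies:
expanded_share = computer_share * 10
supported = total_employment * expanded_share

print(f"Expanded share: {expanded_share:.0%}")
print(f"Jobs 'supported': {supported / 1e6:.1f} million")
print(f"Everyone else: {(total_employment - supported) / 1e6:.1f} million")
```

Even under that generous tenfold expansion, roughly 90 million workers remain outside the industry's reach -- the gap the question above is pointing at.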