Reprinted with permission from Wendell Potter, first published at the Center for Public Integrity.

By Wendell Potter

The reason health care costs are so high is that Americans don’t have nearly enough “skin in the game.”

That was the phrase that many of my former colleagues in the insurance industry and I began using in the early 2000s as a way to deflect attention away from us.

Americans — especially American employers — looked to private insurers to help control medical costs. But insurers were failing miserably, and some of them — Aetna in particular — were also failing Wall Street.

Thirteen years ago, investors and Wall Street financial analysts were not happy with the way some managed care companies were running their businesses. They felt that Aetna and other big for-profit insurers were spending far too much of their policyholders’ premiums paying claims. And they didn’t like that insurers hadn’t been aggressive enough in getting rid of “unprofitable” customers.

One way to satisfy Wall Street was to begin shifting more and more of the cost of health care — and health insurance — to their customers. That meant that sick policyholders in particular would be paying more out of their own pocket for their care.

Our marketing folks came up with an almost Orwellian name for this cost shifting — consumer-driven health care. In retrospect, it was a brilliant strategy, and one that got virtually no pushback from lawmakers or regulators. Little by little, year after year — and long before many people outside of Illinois had ever heard of Barack Obama — Americans began putting more of their skin in the health care game. They had no choice.

The strategy has been so successful that insurers are back in Wall Street’s good graces. Their profits keep breaking records, and so does the price of their stock.

But what’s good for them has been anything but good for a growing number of Americans. Out-of-pocket expenses have gotten so high that nearly half of American families don’t have enough money in the bank to pay their deductibles if they get really sick.

That was one of the findings of last week’s report from the Kaiser Family Foundation, which decided to look into how the skin-in-the-game strategy is affecting family budgets. KFF’s researchers found that 49 percent of American households wouldn’t have enough liquid assets to meet what their out-of-pocket obligations would be if they were in a plan with a $2,500 individual deductible and $5,000 family deductible.

While conventional wisdom holds that consumer-driven health care has contributed to a slowing in the rate of medical inflation, it also undoubtedly has contributed to a very troubling phenomenon: people with health insurance who are no longer getting the care they need because they don’t have enough money to meet their deductibles.

“High deductibles may be okay for people who are generally healthy and have the resources to pay their cost sharing when they need to,” Kaiser Family Foundation CEO Drew Altman wrote in a Wall Street Journal commentary last Wednesday. “But big deductibles can also be a real barrier to needed care for people with moderate or lower incomes who are sick.”

The additional skin we’ve had to put in the game has been fairly modest on a year-to-year basis, so modest in fact that it hasn’t attracted much attention. But when you look back over the past decade, the cumulative increase is startling. Out-of-pocket costs have increased 100 percent or more in most states since 2003, according to The Commonwealth Fund, which also has been following this trend.

And while this has been going on, premiums have been going through the roof. The average premium for an employer-sponsored plan nearly tripled between 1999 and 2014, from $5,791 to $16,834. And lest you think Obamacare is to blame, some of the biggest annual increases occurred during the decade before the Affordable Care Act was passed.

Not only that, but a growing percentage of the premiums is coming out of workers’ paychecks. In 1999, the employee contribution to premiums for a family policy averaged 26.6 percent, according to KFF. It had risen to 29 percent by 2010, the year Congress passed the Affordable Care Act.

After the law went into effect, the percentage actually declined. It stood at 28.7 percent in 2014. Even with that modest decline, workers in 2014 nevertheless were paying more than three times as much for their employer-sponsored family coverage as they did in 1999 ($4,823 versus $1,543).

The first time I recall hearing an insurance company executive use the term “skin in the game” was in 2002, when then-Aetna CEO John W. Rowe used it during a call with financial analysts and investors. The occasion was the company’s first quarter earnings report. Rowe described what he and his management team had been doing to turn the company around, which included getting rid of millions of members Aetna considered to be money losers.

Rowe said that to keep shareholders happy, Aetna would increase premiums 18 percent that year and shift more of the cost of care to its remaining members.

“We are giving people some skin in the game, a personal financial interest in being cautious about seeking care,” Rowe told the New York Times.

That was music to investors’ ears. Aetna’s shares rose 13.5 percent that day.

The stock price closed at $11.42 on April 26, 2002 (after adjusting for subsequent stock splits). Last Friday it closed at $104.09, a historic high — thanks, in large part, to all that skin its policyholders have had to put in the game.

Wendell Potter is the author of Deadly Spin: An Insurance Company Insider Speaks Out on How Corporate PR is Killing Health Care and Deceiving Americans and Obamacare: What’s in It for Me? What Everyone Needs to Know About the Affordable Care Act.