Over the last year or so in the U.S., many of the plastic credit cards we carry around every day have been replaced by new ones with chips embedded in them. The chips are supposed to make your credit and debit cards more secure—a good thing!—but there’s one little secret no one wants to admit:

The U.S. transition to chip cards has been an utter disaster. They’re confusing to use, painstakingly slow, less secure than the alternatives and aren’t even the best solution for consumers.

If you’ve shopped in a store and used a credit card, you’ve noticed the change. Retailers have likely asked you to insert your card into the reader, chip first, instead of swiping. But reading the chip seems to take much longer than a quick swipe. And even though many retailers now have chip-reading machines, some ask for just the opposite: they tell you not to insert the card, and to swipe instead. There seems to be no rhyme or reason to the whole thing.

Most people have the same question: why are chip cards so bad?

When you try to figure out who’s to blame, you end up with a weirdly tangled web of misaligned incentives. Almost everyone involved—banks, credit card companies, retailers and merchants, payment processors, terminal manufacturers—has been focused on their own bottom line, rather than the impact their decisions would have on customers. And that has created a maelstrom of incompetence.

All of this started when the U.S. decided to move to the chip standard—known in the industry as EMV. The U.S. process was different from that of other countries, where governments mandated that everything be upgraded by a certain date.

The U.S. implemented something called a “liability shift”—essentially, if a retailer didn’t support chip card payments by buying a new, expensive machine, it would be held accountable for any fraud that occurred in its stores. Usually, that’s the bank’s responsibility.

So, as long as retailers purchased the new chip-card reading terminals, liability shifted back to the bank. In a July report on the chip card transition in the U.S., the Aite Group, a financial services research firm, cited the lack of a mandate in the U.S. as one reason the chip card transition has been so confusing.

The liability shift date in the U.S. was Oct. 1, 2015. But when the date actually rolled around, shoppers were hard pressed to find a chain retailer that supported chip cards, let alone a mom-and-pop shop. In a letter asking credit card companies to postpone the liability shift, the Food Marketing Institute, an industry trade group, wrote that, as of April 2015, retailers were experiencing four-month delays just waiting for their new terminals to arrive.

And just because shops finally got new terminals didn’t mean they’d immediately start accepting chip cards. Their payment processors needed to certify that their systems were still compliant and working correctly before the chip readers could be turned on. Even in 2016, they could only do that by physical inspection. That process can drag on for weeks, and some bigger retailers were still certifying their terminals as of early 2016, according to sources who spoke to Quartz.

The idea of the liability shift was to incentivize retailers to upgrade their equipment without forcing them to. And to a degree, that worked—for big retailers like Walmart, Target and Home Depot. Magnetic stripe cards were causing them massive fraud: millions of Americans had their personal information stolen in hacks at Target and Home Depot. Suddenly, being held financially responsible for that fraud could cost those stores millions.

But what about mom-and-pop shops, or coffee shops like Starbucks?

Because fraud isn’t as big a deal at smaller stores, there’s less incentive for those retailers to spend tens of thousands of dollars to upgrade. Based on dozens of conversations with store owners, the cost of replacing their equipment has been the primary reason many small business owners haven’t supported chip cards.

There are also logistical nightmares for merchants that accept chip cards. For example, have any bars or restaurants you’ve been to in the U.S. asked you to pay with a chip card? Probably not. That’s because bar and restaurant owners are worried about the backend issues of supporting chip cards, according to many who spoke to Quartz.

Some are worried about changing the tipping process. Others, specifically bars, are more concerned about keeping track of bar tabs. Despite chip cards being a new technology, their software doesn’t allow bars to keep a tab open, so they’d have to track spending the old-fashioned way: with pen and paper. That can get hectic when swarms of people are opening and closing tabs on Friday and Saturday nights.

All of these issues—and a plethora of other regulatory, technical, and software problems—have made the chip card rollout in the U.S. a slow-motion trainwreck.

A lot of these problems could have been prevented. Bar owners I spoke to said they had known about the bar tab issues long before the liability shift. Payment experts have been openly complaining about the lack of education on how and when customers should use chip cards.

The Aite Group noted in a June 2014 report that when Australia adopted EMV, banks and retailers there ran full-page newspaper ads explaining the benefits of chip cards and how to use them. Maybe everyone should have been more proactive about education years ago, instead of waiting until the last minute.

And things have gotten even weirder post-rollout.

Back in late October 2015, the FBI criticized chip-and-signature—the way U.S. chip cards have been deployed—saying that chip-and-PIN, the system used in the rest of the world, was safer. Yet despite the FBI’s protests, nothing changed.

Critics have told me banks opted for a signature over a PIN because it saves them large amounts of money by not having to store PIN codes for everyone. Banks, on the other hand, say they feared their customers would have a difficult time remembering a four-digit code. (This, despite millions of people already using check cards and other access cards with PINs.)

But the simple fact is that with chip-and-signature, banks created a new, less secure way to pay—when a more secure version was available. The cost of the extra security, to be fair, wasn’t inconsequential: the Aite Group estimates that supporting chip-and-PIN would cost banks $1 billion, and merchants another $4 billion. But that cost would be divided among dozens of banks and thousands of businesses. And it would have meant more security for consumers.

The whole chip card transition was timed to begin as the 2015 holiday shopping season got underway. Contrary to expectations, things actually went pretty smoothly at first. Even if checkout times were longer because of chip cards, lines weren’t noticeably longer. That puzzled a few payments experts I spoke to around that time—the consensus was that processing a chip card took at least 10 seconds longer than swiping. So why weren’t lines longer?

According to Forbes, CVS simply shut off the chip-reading part of its terminals during the holiday season to avoid the inevitable long lines. And CVS probably wasn’t the only retailer to do so. So, to rehash: the solution to longer lines wasn’t to make checkouts faster, but to completely bypass the new security feature during the busiest shopping season of the year.

Smart.

At least credit card companies are owning up to this… right? Visa and MasterCard tacitly admitted the rollout’s flaws when the two companies announced changes to make chip card payments more tolerable, among them fixing the certification process and devoting more resources to developing better terminal software.

But don’t expect things to get better soon. Mobile wallets like Apple Pay and Android Pay can help cut down on checkout time, since they’re a lot faster than chip cards. But for the less digitally inclined, plastic cards and those tiny metal chips will probably remain cumbersome for the foreseeable future.

If anything, the chip card transition in the U.S. proved one thing: making infrastructure changes to the U.S. economy is incredibly complex and takes much longer than even the experts would like to think.