Apple Pay may seem like the future of paying for stuff. Hold your smartphone up to a small panel at the front of the store, press your finger to the sensor, and walk out the door. But the future of paying for stuff was supposed to be even easier.

Check out the IBM video above, a blast from the year 2006. It shows a man brazenly shoving all sorts of groceries—chips, meat, frozen dinners—into his trench coat. Other patrons see him doing it. So does the guy at the deli counter. But no one says a word. The punchline is that he wasn't stealing at all. When he walks out the door, a distant scanner takes inventory of everything he picked up, automatically bills him, and prints a receipt.

The video was part of a series produced by IBM to promote a wireless tracking technology called RFID, short for radio-frequency identification. RFID chips are used to track and identify everything from pets to sushi plates. You might also remember the controversy in 2005 when the U.S. State Department started requiring RFID chips in all U.S. passports. It was the original Internet of Things.

Today, we use the term "Internet of Things" to refer to everything from wearable computers to self-driving cars. But in the early-to-mid-2000s, the concept was virtually synonymous with RFID. In fact, technologist Kevin Ashton claims to have coined the term "Internet of Things" in 1999 during a presentation on RFID at Procter & Gamble. IBM saw all sorts of promise in this new internet.

But the technology never took off as many expected it would. It never helped us pay for stuff as easily as that guy in the trench coat did. To be sure, RFID is still widely used in retail and shipping today. It's even at the heart of near field communication, the technology that powers Apple Pay and other contactless payment systems. But the RFID revolution still hasn't arrived. Maybe Apple Pay will be different.

Correction 11/15/2014 2:15 PM EST: This article has been updated to make clear that Apple Pay actually uses RFID