This would primarily be helpful for ensuring you get a quality charger, but the researchers also see it as useful when you're charging more than one device at a time. Instead of letting the pad decide which gadget gets all the power based on its distance from the coil, the chip could throttle charging so that power is distributed evenly. While this would naturally take longer, it beats having to plug in secondary gear just to make sure it gets any power at all.
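To make the idea concrete, here's a minimal sketch of what even power distribution might look like. Everything here is an illustrative assumption, not MIT's actual design: the function name, the per-device power caps, and the fixed pad budget are all hypothetical.

```python
# Hypothetical sketch: splitting a charging pad's power budget evenly
# across devices instead of letting proximity decide who gets everything.
# Names and wattages are illustrative assumptions, not MIT's design.

def allocate_power(pad_budget_w, devices):
    """Give each device an equal share of the budget, capped at the
    maximum power it can accept; unused headroom is re-shared."""
    if not devices:
        return {}
    allocation = {}
    remaining = pad_budget_w
    # Handle the most constrained devices first, so any budget they
    # can't accept is redistributed to the remaining devices.
    pending = sorted(devices.items(), key=lambda kv: kv[1])
    while pending:
        share = remaining / len(pending)
        name, max_w = pending.pop(0)
        granted = min(share, max_w)
        allocation[name] = granted
        remaining -= granted
    return allocation

# Example: a 15 W pad shared by a phone, earbuds, and a watch.
# The earbuds and watch take only what they can accept, and the
# leftover budget flows to the phone.
print(allocate_power(15, {"phone": 10, "earbuds": 2, "watch": 2.5}))
```

The point of the sketch is the trade-off the article describes: every device charges at once, but the phone gets 10 W instead of the full 15 W it might have received alone.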

The invention wouldn't make everyone happy. It's easy to see the chip being used as a licensing tool that forces charger makers to pay up if they want compatibility with a given phone. There's also the question of legacy support: how do you introduce the chip without turning legions of existing wireless charging accessories into paperweights? MIT's work could go a long way toward discouraging cheaply made and knockoff chargers, but the days of ubiquitous hardware support might come to an end if the chip enters use without a broadly implemented standard in place.