What's really impressive is how conceptually elegant this setup is. The whole thing is made possible by a Synaptics CMOS sensor -- basically a tiny camera -- laminated to the back of an OLED panel. (LCDs are a no-go because of their backlights.) Light from the OLED display itself illuminates your fingerprint, which the sensor "sees" and checks against the print stored on the phone. Of course, none of this will matter to the people who actually try it: aside from the fact that you can't see the sensor, it works just like you'd expect.
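That capture-and-compare step could be sketched, very loosely, as a thresholded image comparison. Everything below is hypothetical -- the function names, the pixel values, and the 0.9 threshold are made up for illustration, and real fingerprint matchers compare extracted minutiae features rather than raw pixels:

```python
# Hypothetical sketch of the verify step an under-display optical
# sensor might perform: compare a fresh capture against the enrolled
# template. Images are flat lists of grayscale pixels (0-255).

def similarity(captured, enrolled):
    """Map mean absolute pixel difference to a 0.0-1.0 agreement score."""
    if len(captured) != len(enrolled):
        raise ValueError("images must be the same size")
    total_diff = sum(abs(a - b) for a, b in zip(captured, enrolled))
    return 1.0 - total_diff / (255.0 * len(captured))

def verify(captured, enrolled, threshold=0.9):
    """Accept the finger only if the images agree closely enough.
    A scratch in the light path or a harder press distorts the capture,
    lowering the score toward rejection."""
    return similarity(captured, enrolled) >= threshold

enrolled   = [120, 80, 200, 40, 155, 90]
same_press = [118, 82, 198, 41, 150, 92]   # near-identical capture
distorted  = [60, 140, 90, 180, 40, 200]   # heavy press / scratched glass

print(verify(same_press, enrolled))  # small differences: accepted
print(verify(distorted, enrolled))   # large differences: rejected
```

The toy threshold also hints at the trade-off the article runs into later: set it too strict and legitimate presses get rejected; too loose and security suffers.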

That's not to say it worked all the time, though -- at first, the sensor often didn't recognize my finger. This phone definitely isn't ready to go on sale, so it's hard to say whether my frequent failed attempts were due to non-final software or a sensor that just didn't work well. Thankfully, whatever the issue was, it seemed to resolve itself.

Now, I'm taken with how clever this approach is, but it also has some potentially fundamental pain points. For one, people screw up their screens all the time. A crack running across the display or a few well-placed scratches could mean what the sensor sees no longer matches the stored image of your fingerprint. Then again, it seems likely that you could log in another way and re-register your finger to make that stored print the new normal. (A Synaptics rep stopped short of confirming this outright.)

More importantly, the sensor might not recognize your fingerprint if it's distorted -- say, if you're pressing your finger onto the screen harder than you were when you registered it. Needless to say, this is going to take a lot of getting used to. Hopefully, it won't be too long before we get to try a sensor like this in a more polished phone.

Click here to catch up on the latest news from CES 2018.