Windows phones could soon learn a new trick from their Kinect cousins: the ability to see. Microsoft Research released a video late last month demonstrating its prototype "pre-touch sensing" system, which uses a self-capacitance touchscreen to detect both the user's grip around the outer edge of the phone and fingers hovering just above the display. With that sensor data, the phone understands how it's being held and can anticipate interactions based on its orientation, whether it's gripped in one hand or two, how many fingers are involved, and how far they are from the screen.
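The anticipation logic described above can be sketched roughly as follows. This is an illustrative mock-up only: the sensor frame fields, thresholds, and control names are assumptions for the sake of the example, not Microsoft's actual API.

```python
from dataclasses import dataclass

@dataclass
class PreTouchFrame:
    """One frame of hypothetical pre-touch sensor data."""
    one_handed: bool        # grip detected on only one edge of the phone
    hover_fingers: int      # number of fingers sensed above the screen
    hover_height_mm: float  # distance of the nearest finger from the glass

def anticipate_ui(frame: PreTouchFrame) -> str:
    """Pick a UI treatment before any touch lands.

    Thresholds and return values are illustrative; the point is simply
    that grip and hover data can drive the interface ahead of contact.
    """
    if frame.hover_fingers == 0:
        return "hide-controls"          # nothing approaching: keep the screen clean
    if frame.hover_height_mm > 30:
        return "fade-in-controls"       # finger still far away: start revealing UI
    if frame.one_handed:
        return "thumb-reachable-menu"   # one-handed grip: bias controls toward the thumb
    return "two-handed-controls"        # two hands: use a full-width layout

# Example: one-handed grip, one finger 12 mm above the glass
print(anticipate_ui(PreTouchFrame(one_handed=True, hover_fingers=1, hover_height_mm=12.0)))
```

In this sketch, the decision happens entirely before contact, which is what distinguishes pre-touch sensing from ordinary touch handling.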