Your sense of touch could be the next frontier in relaying valuable contextual information if new research currently being conducted at MIT proves successful. Researchers believe it may be possible to design wearable arrays of GPS-enabled vibration motors that provide simple navigational cues or detailed data through a kind of tactile Morse code. This could lead to non-visual haptic display technology — why not check your email without even opening your eyes?
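To make the "tactile Morse code" idea concrete, a message could be translated into timed vibration pulses for a single motor. The sketch below is purely illustrative and assumes standard Morse timing (dot = 1 unit, dash = 3 units); the motor interface and timings are assumptions, not part of the MIT design.

```python
# Illustrative sketch: turn text into timed vibration pulses, the way
# a "tactile Morse code" might drive one motor. Timing follows Morse
# convention: dot = 1 unit, dash = 3, 1-unit gap inside a letter,
# 3-unit gap between letters. The hardware interface is hypothetical.

MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def to_pulses(text, unit_ms=100):
    """Return a list of (vibrate_ms, pause_ms) pairs for a message."""
    pulses = []
    for ch in text.upper():
        code = MORSE[ch]
        for j, symbol in enumerate(code):
            on = unit_ms if symbol == "." else 3 * unit_ms
            last_symbol = (j == len(code) - 1)
            off = 3 * unit_ms if last_symbol else unit_ms
            pulses.append((on, off))
    return pulses

print(to_pulses("SOS"))
```

A real device would also have to respect the spacing and damping limits described below, but the encoding step itself is this simple.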

Your skin and eyes process very different kinds of sensory data, but your skin has nearly as many sensory receptors as your eyes do. The simple fact that you have touch receptors spread over about two square meters of skin makes it an ideal route for conveying information. The question being investigated by MIT senior research scientist Lynette Jones is where that input should be directed.

While all skin is capable of detecting tactile interactions, not all of it is equally sensitive. Just like screen resolution for visual data, the larger the array of vibration motors, the more detailed the haptic data stream can be.

To determine how well people can identify touch input, Jones has designed and built a pair of wearable devices. The first consists of eight accelerometers tied into a pancake vibration motor of the same sort used in cell phones. This allowed researchers to gauge how far vibrations propagate through skin. They found that the vibration was not detectable beyond 8mm from the motor.

The second wearable device consisted of a 3×3 array of vibration motors and was used to test how well people can discern where a vibration is coming from. Even though the physical vibration was not measurable beyond 8mm from a motor, participants perceived it spreading out to about 24mm. That mismatch makes it hard to identify which motor is vibrating when the motors are spaced less than a few centimeters apart.

Another barrier to overcome is the damping effect of the skin, which varies across the body. The intensity of the vibration might need to be adjusted based on where the device is worn to attain the desired "display resolution." Jones and her team found that looser skin tends to damp the motors more, but this could actually be an advantage: because the vibrations travel a shorter distance, motors could be packed closer together with less interference between them.

If the research pans out, it could lead to a new way of taking in data — possibly in the same way blind individuals read braille. Wearable grids of small vibration motors could act as simple left/right navigation indicators, or notify you of new messages. With additional environmental data, the same technology could help emergency personnel find their way around a burning building, or relay messages when looking at a visual screen would be too dangerous.
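A left/right navigation cue on a grid like the team's 3×3 prototype could be as simple as pulsing one column of motors. The mapping below is an illustrative assumption, not the MIT team's actual scheme; motor positions are indexed as (row, column).

```python
# Sketch of how a 3x3 motor grid might encode simple navigation cues.
# 1 = vibrate, 0 = off. The cue-to-motor mapping is an illustrative
# assumption, not the researchers' design.

CUES = {
    "left":  [(0, 0), (1, 0), (2, 0)],   # pulse the left column
    "right": [(0, 2), (1, 2), (2, 2)],   # pulse the right column
    "stop":  [(1, 1)],                   # center motor only
}

def frame_for(cue):
    """Return a 3x3 on/off frame for a navigation cue."""
    frame = [[0] * 3 for _ in range(3)]
    for row, col in CUES[cue]:
        frame[row][col] = 1
    return frame

for row in frame_for("left"):
    print(row)
```

Given the ~24mm perceived spread described above, a real grid would need centimeter-scale spacing between motors for cues like these to remain distinguishable.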

The MIT team is currently designing arrays that can be worn across the back, and a smaller version that wraps around the wrist. Coupling these devices with wireless technologies could even allow for integration with existing smartphone navigation and messaging services.
