Head-mounted wearable computers present a bit of an interface problem. Voice-based head-mounted systems give the impression that the wearer is murmuring to themselves, and accelerometer-based systems that rely on head movement make users look like they have a nervous tic.

One solution to the head-mounted-computer user interface conundrum involves hand gestures. Enter a new Google patent that appears to be the search giant's answer to controlling its Project Glass augmented reality system. Titled "Wearable marker for passive interaction," the patented system, which went public Tuesday, would use a reflective infrared identifier placed on a user's hand to track and identify the user's gestures.

The IR identifier would be invisible to the human eye and could be placed on a ring or glove, or even affixed to a fingernail. (Whether the fingernail identifier would be bejeweled isn't defined in the patent's language.) An IR camera integrated into the HMD (head-mounted display) would track the marker's reflected IR image.

The HMD would be controlled by the user's hand movements: a particular gesture pattern could, for example, launch an application or open a document.
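The gesture-to-action mapping described in the patent could work something like the following sketch. Everything here is an assumption for illustration: the gesture names, the actions, and the idea that a recognizer has already converted tracked IR marker motion into a named pattern.

```python
# Hypothetical sketch: dispatching recognized gesture patterns to HMD actions.
# Assumes some upstream recognizer has turned IR-marker motion into a
# pattern name; the names and actions below are invented for illustration.

def launch_application() -> str:
    return "application launched"

def open_document() -> str:
    return "document opened"

# Map gesture pattern names to actions (illustrative only).
GESTURE_ACTIONS = {
    "double_tap": launch_application,
    "swipe_right": open_document,
}

def handle_gesture(pattern: str) -> str:
    """Look up a recognized gesture pattern and run its action."""
    action = GESTURE_ACTIONS.get(pattern)
    if action is None:
        return "unrecognized gesture"
    return action()
```

A simple dispatch table like this keeps the recognizer decoupled from the actions, so new gestures can be registered without touching the tracking code.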

In addition to interacting with a wearable system that looks suspiciously like Project Glass, the IR identifier could also be used to identify individual users. For example, the system could offer predetermined, custom eyewear settings for each user: put on your Google glasses, look at the IR identifier on your finger, and the system activates your presets.
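The per-user preset idea amounts to a lookup keyed on the marker's identifier. The sketch below is a guess at how that could be structured; the marker IDs, setting names, and default values are all invented, not taken from the patent.

```python
# Illustrative sketch: mapping an IR marker's identifier to stored
# per-user eyewear presets. Marker IDs and settings are assumptions.

DEFAULT_PRESETS = {"brightness": 0.6, "text_size": "medium"}

USER_PRESETS = {
    "marker-0x3f": {"brightness": 0.8, "text_size": "large"},
    "marker-0x7a": {"brightness": 0.5, "text_size": "small"},
}

def activate_presets(marker_id: str) -> dict:
    """Return the eyewear settings registered for this marker,
    falling back to shared defaults for unknown markers."""
    return USER_PRESETS.get(marker_id, DEFAULT_PRESETS)
```

Falling back to shared defaults means a borrowed pair of glasses would still be usable by someone whose marker isn't registered.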

Of all the input systems that could be used to control Project Glass, hand gestures would seem to make the most sense. That is, if you're comfortable looking like you're conducting an orchestra while walking down the street.