The wily geniuses at the Fraunhofer Institute in Germany have created the world’s first real-time emotion detection app for Google Glass. The app (glassware, as Google prefers to call it) can also estimate someone’s age and detect their gender. All of the analysis is carried out on-board: the cloud isn’t involved, so the raw image data (which might be sensitive in nature) never leaves the Glass device. Real-time emotion detection could be of great use to people with disorders such as autism, who often struggle to interpret facial expressions, or simply to people who struggle to divine their partner’s true emotional state when they say that they’re “fine.”

Fraunhofer’s Google Glass app is based on its tried-and-tested SHORE (Sophisticated High-speed Object Recognition Engine) system. SHORE started off as an object-detection computer vision system, but over the years it has developed into a face detection and fine-grained facial analysis system. It can pick out a person’s face with a 91.5% success rate, and identify that person’s gender 94.3% of the time. It can even take a stab at the person’s age. Previously, SHORE (which is essentially a highly optimized C++ library) had been deployed on various computer systems, from PCs to tablets. Now, Fraunhofer IIS has squeezed all of that facial analysis goodness onto Google Glass’s rather wimpy hardware (1GB of RAM, dual-core TI OMAP 4430 SoC). Watch the video below; it’s a little bit scary how accurate the system is at detecting gender, age, and emotional state.
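To make the on-device architecture concrete, here’s a minimal sketch of the kind of pipeline the article describes: detect faces in a camera frame, classify each face for gender, age, and emotion, and hand back only those derived labels. SHORE’s actual API isn’t public, so every name here (`detect_faces`, `classify_face`, `FaceAnalysis`, and the canned results) is a hypothetical placeholder; the point is that the raw frame is consumed locally and never transmitted.

```python
from dataclasses import dataclass

@dataclass
class FaceAnalysis:
    gender: str        # e.g. "female" / "male"
    age_estimate: int  # rough estimate, in years
    emotion: str       # e.g. "happy", "angry", "sad", "surprised"

def detect_faces(frame: bytes) -> list:
    """Stub for the on-board face detector (SHORE reports a ~91.5%
    detection rate). A real detector would return face bounding boxes;
    this placeholder pretends every frame contains exactly one face."""
    return [frame]

def classify_face(face_region) -> FaceAnalysis:
    """Stub for the attribute classifiers (Fraunhofer quotes ~94.3%
    accuracy for gender). Returns a canned, illustrative result."""
    return FaceAnalysis(gender="female", age_estimate=31, emotion="happy")

def analyze_frame(frame: bytes) -> list[FaceAnalysis]:
    """The whole pipeline runs locally: only derived labels are
    returned, and the raw camera frame is never stored or uploaded."""
    return [classify_face(face) for face in detect_faces(frame)]

# Usage: feed in a raw camera frame, get back labels only.
for result in analyze_frame(b"raw-camera-bytes"):
    print(result.emotion, result.gender, result.age_estimate)
```

Note the design choice this models: because `analyze_frame` returns only a `FaceAnalysis` record, adding identity lookup (i.e. facial recognition) would require an explicit extra step, which is exactly the step Fraunhofer says it has left out.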

Putting aside for a moment the awesome implications of being able to use technology to accurately detect someone’s emotions in real time, there are obviously some privacy concerns to consider here. Fraunhofer stresses that a) the image data never leaves the device, and b) the Glass app can’t determine the identity of the people that you look at. This is only marginally comforting: The app could do both of these things; Fraunhofer has simply chosen not to enable them. It is really only a matter of time until someone, if not Fraunhofer, goes the whole hog and enables facial recognition.


While such a device would certainly have creepy implications, I think the positive uses would outweigh the negative. Imagine if, while you walked around outside, a wearable computer and head-mounted camera were constantly analyzing the faces and facial expressions of everyone around you. It could warn you if someone, a potential mugger say, is looking at you aggressively. It could tell you if the long-lost friend you’ve been trying to get coffee with for months is on the other side of the street. It could tell you if the woman/man of your dreams is walking towards you, and that you should create some kind of romantic-comedy-style diversion to insert yourself into their life. People who are bad at remembering faces or reading facial expressions (such as people with autism) could stand to gain a lot from such a device. You get the idea.

Yes, we shouldn’t just ignore the myriad privacy concerns. But with wide-scale facial recognition software being rolled out by government agencies and private companies, it’s possible that the cat is already out of the bag. Yes, the constant erosion of privacy brought on by the internet, smartphones, and now wearable computers is a little bit daunting. But if police forces are going to automatically recognize faces on a large scale to catch criminals, and commercial entities are going to use facial recognition as the ultimate real-world “tracking cookie,” should we really be too concerned about Google Glass and other personal wearable computers doing the same thing?

Such discussions are probably moot, anyway. I think we all know that, as mobile technology gets ever smaller and more powerful, real-time facial detection and emotion sensing are inevitable; it’s just a matter of when, and which consumer-facing company gets there first.