Emotient, a startup based out of San Diego that works in the emerging area of facial expression recognition, is today announcing a $6 million round of funding and its first steps into applying its technology in the wearables market: a new piece of “glassware” for Google Glass that performs sentiment analysis by reading people through the headgear’s camera.

Longer term, the aim is to become the sentiment analytics engine for “any connected device with a camera,” the company notes, with a SaaS model based around its API a fundamental part of that strategy.

The Series B round of funding was led by Seth Neiman, formerly a general partner at Crosspoint Venture Partners and now leading new VC firm Handbag. Previous investor Intel Capital also participated — bringing the total raised to date to $8 million since the startup was first founded in 2012.

Emotient says that the funds will be used to “broadly commercialise” its technology by way of its Emotient API, along with specific products for verticals like retail and healthcare. It competes against others like KPCB/Horizon Ventures-backed Affectiva, which has focused on the worlds of marketing and ad-tech, among other areas.

Emotient says it is distinct from Affectiva because of how its data is delivered. “We believe our technology is differentiated in its ability to deliver sentiment and emotional insights in real-time and in its accuracy in uncontrolled environments, such as a crowded store,” spokesperson Vikki Herrera told me.

Today Emotient is unveiling details of one of the first applications: a Google Glass app — known as “glassware” — that it will be targeting first to the retail segment: salespeople who wear Glass can use it to measure how customers respond during their interactions and then get feedback that can help tailor their responses — particularly aimed at training for future situations, but also for real-time feedback.

“All good business leaders know ‘you get what you measure’, and being able to objectively and accurately monitor customer sentiment allows retail teams to build plans and tactics to win,” said Ken Denman, CEO, Emotient, in a statement. “The ability to measure real-time customer sentiment, as it relates to customer service, products and merchandising is a huge opportunity for businesses to drive focus and therefore sales.”

Although retail is the first use case, you can’t help but think that this might be relevant for all Google Glass wearers. Google has made a point of outlining the etiquette of how not to be a Glasshole, although clearly some “Explorers” haven’t been able to read normals’ responses very well.

Neiman, who is taking a board seat with his investment, describes Emotient’s co-founders — they include Marian Bartlett, Ph.D.; Ian R. Fasel, Ph.D.; and Javier R. Movellan, Ph.D. — as pioneers in the field, and says that his interest comes from the fact that facial recognition is fast evolving, with more sophisticated demands for how it is used.

“We believe the next phase will extend from physiology to psychology,” he says in a statement.

Emotient’s technology is based around detecting and tracking the seven expressions of primary emotion — joy, surprise, sadness, anger, fear, disgust and contempt — as well as overall sentiments like positive, negative, and neutral.

It tracks some 19 different basic muscle movements to also suss out advanced emotions like frustration and confusion. It’s an example of how machine learning and AI specialists are working to create increasingly human-like technology.
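To give a sense of how muscle movements might roll up into emotions and sentiment, here is a minimal, purely hypothetical sketch in Python. Emotient has not published its models or API in this article, so the action-unit table, the scoring rule, and the `classify` function below are all illustrative assumptions, loosely inspired by FACS-style facial action coding rather than by Emotient’s actual system.

```python
# Hypothetical sketch: mapping detected facial "action units" (FACS-style
# muscle movements) to one of seven primary emotions, then reducing that
# to an overall positive/negative/neutral sentiment. The AU-to-emotion
# table and the 0.5 threshold are illustrative, not Emotient's real logic.

# Simplified lookup: which action units contribute to which emotion.
AU_TO_EMOTION = {
    "joy":      {6, 12},               # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},
    "sadness":  {1, 4, 15},
    "anger":    {4, 5, 7, 23},
    "fear":     {1, 2, 4, 5, 20, 26},
    "disgust":  {9, 15},
    "contempt": {12, 14},
}

POSITIVE = {"joy", "surprise"}

def classify(active_aus):
    """Score each emotion by the fraction of its action units detected,
    then map the strongest emotion to an overall sentiment label."""
    scores = {
        emotion: len(active_aus & aus) / len(aus)
        for emotion, aus in AU_TO_EMOTION.items()
    }
    emotion, score = max(scores.items(), key=lambda kv: kv[1])
    if score < 0.5:
        sentiment = "neutral"
    elif emotion in POSITIVE:
        sentiment = "positive"
    else:
        sentiment = "negative"
    return emotion, score, sentiment

# A face showing a cheek raise (AU 6) and lip-corner pull (AU 12):
print(classify({6, 12}))  # -> ('joy', 1.0, 'positive')
```

A real system would of course work from continuous per-frame intensity estimates produced by a trained model, not a binary set of detected units, but the shape of the reduction — muscle movements to emotion scores to a coarse sentiment label — is the idea the paragraph above describes.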

Glass is Emotient’s first foray into wearables, although it has successfully ported its technology to Android and other tablets.

It will be going into other places. Intel is a strategic investor and also a customer: it will be incorporating Emotient’s libraries into the next version of its RealSense (formerly Perceptual Computing) SDK, which will open the door for a broad spectrum of developers to access a basic version of the Emotient API and emotion-enable a wide variety of apps and services.