It’s time those front-facing cameras on smartphones were used for more than just video chatting or taking low-resolution pictures: Pantech is adding software in its phones that will allow users to control the handset with gestures. The concept is similar to Microsoft’s Xbox Kinect but on a smaller scale.

Pantech’s new Vega LTE smartphones will be the first products to gain the technology, which is powered by eyeSight, an Israeli startup we highlighted in June of 2010. Using the eyeSight solution, handset owners will be able to use a gesture to answer a call, start music playback, control games and more. The idea is that the gesture interaction is useful when the smartphone is in a hands-free mode such as driving, cooking or when the phone is sitting in a dock.

Last year, when we first heard of eyeSight, the company’s founder and CEO, Itay Katz, said this in the news release: “Users are looking for ways to ease, improve and enjoy their day-to-day interaction with their mobile phone, ideally aiming to gain effortless control of the device’s applications and functions, which is where eyeSight’s solution comes to place.”

While I’m not convinced the general public will take to gestures on smartphones the way it has with the Xbox Kinect, I agree with Katz’s sentiment that users are looking for improved device interaction. For those who use a smartphone hands-free, eyeSight’s solution should bring that improvement. But the smartphone is a device that is generally meant to be held, so I think there’s a limited audience for gesture controls on a handset.

Still, the news that eyeSight’s solution is getting picked up by a hardware manufacturer — Pantech is one of Korea’s top three handset makers — shows that some are willing to think outside the box when it comes to user interfaces on mobile devices. And it’s not the only such effort: take a look (or a listen?) at Siri on Apple’s iPhone 4S and you’ll see another example of improved user interaction.

Touch is still a key user interface on mobile devices, but with all the sensors in our smartphones, expect to see more of these forward-thinking ideas as devices begin to adopt what I call the “invisible interface.”