Intel Israel’s R&D center in Haifa is in charge of developing a 3-D camera, which is to be launched next year. Igal Iancu, strategic planning director of the Intel Perceptual Computing Group, has confirmed to Calcalist that Intel is developing the device, which is designed to be incorporated into laptops and tablets and which offers highly precise sensing and remote control capabilities, including the ability to decipher users’ moods.

Earlier this week [Aug. 26], the IDG News Service reported that the chip giant is developing a “depth sensing” camera, in effect an enhanced version of a 3-D camera, which, according to the company, “can bridge the gap between the real and virtual world.”

“Depth cameras are going to be used as an integral part of future man-machine interfaces,” says Iancu of the new development, which has not yet been officially introduced to the public at large. “We have had quite a few achievements in this field. For instance, we have managed to incorporate into the device the capability of replacing an image background while shooting video. The depth camera we are working on can detect objects in a far more precise manner than ever before.”

According to Iancu, the Intel Perceptual Computing Group focuses on developing intuitive computer interfaces that draw on users’ natural capabilities, such as gestures, voice, and facial expressions. The development is carried out by a group of 150 engineers, in collaboration with engineers in the United States. Mooly Eden, president of Intel Israel, is responsible for overseeing all of the group’s operations.

Anil Nanduri, director of Perceptual Products & Solutions at Intel, told IDG that the camera would help computers better understand the intentions of their users, bring new levels of interactivity to games, and even identify users’ moods. “It will have the ability to sense excitement and emotions — whether the user is happy or smiling. The algorithms and technology are already there, but they are getting more technologically advanced and stable.”

In addition, the camera would be capable of identifying the distance, size, depth, color, contours and other characteristics of items in view. According to Nanduri, this development could have significant implications in the growing market of 3-D printing.

“You are not going to look for a case for a device anymore. You will just point that device at the camera, and the camera will recognize what you have. It will know the model number, and it will print the case for you, or you can go to the store and they will print it for you there.”

The camera would also be able to accurately track eye movements, a development which, according to Nanduri, has far-reaching educational implications. For instance, it would make it possible to pinpoint the difficulties children encounter when learning to read, by detecting the words they are stuck on and determining whether they need help with specific words.

Nanduri stressed that although the development may call to mind Microsoft's Kinect camera, the camera under development by Intel offers far more advanced capabilities. “Kinect was a good initial version of a depth-sensing camera, especially from a long-range perspective. However, when Intel started looking at it, we were primarily interested in personal interaction at a closer range, of up to 1 meter or a meter and a half,” Nanduri said.

And while there are already cameras on the market with similar capabilities, such as the depth camera developed by Creative in collaboration with Intel, the technology giant is developing a much smaller camera that may be built into computers the way webcams are currently integrated. “This is a major development challenge — introducing so many technologies into such a small space. However, that’s where Intel has the edge,” Iancu told Calcalist.

Using body movements to create content

Last July, Intel held an event for its employees in Qiryat Gat in southern Israel, where the company has two chip manufacturing plants. At the event, Eden presented the camera with a live demonstration of it operating a 3-D model of the solar system displayed on a computer screen. In another presentation, a book, when pointed at the camera, turned into an interactive game: in one instance, a butterfly was seen flying off the book’s pages and landing next to the user.

About a month ago, Intel Israel invited external developers to a development event dedicated to the Perceptual Computing products the company is working on, including the camera described above. “All kinds of brilliant ideas were raised at the conference that had never occurred to us. Controlling a computer is one thing, but it was also suggested that body motions be used to create content, specifically music. There was someone there playing the piano ‘in the air,’ while someone else produced a virtual DJ experience,” Iancu said. “It is one level above the regular interaction designed to replace the keyboard. It’s fun and a lot more creative.”

According to Iancu, the camera is to be incorporated into laptops and tablets offered on the market as early as mid-2014 or the third quarter of next year.