A new patent application from Apple describes tech that combines current 3D simulation for handheld devices with eye-tracking techniques for a new kind of interface.

Apple has filed a patent application for a 3D eye-tracking graphical user interface (GUI) for personal electronic devices like the iPhone and iPad.

The application, published Friday by the U.S. Patent & Trademark Office, describes technology that could be incorporated in the company's iOS mobile operating system for use with gaming, photography, video, biometrics, and surveillance applications, according to the Patently Apple blog, which spotted the filing.

Apple's proposed technology combines aspects of current 3D simulation for handheld devices with facial-recognition technologies, particularly eye tracking, to create a more reliable and realistic 3D interface for users.

Current 3D simulation for gaming and other applications on handheld devices uses data from built-in instruments such as accelerometers and gyroscopes to recreate "six-axis" positional information, the blog notes. Facial-recognition software, meanwhile, has also improved greatly in recent years for biometrics and other purposes.
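To give a sense of how a device turns raw accelerometer and gyroscope readings into a stable orientation estimate, here is a minimal sketch of a complementary filter, one common fusion technique. This is purely illustrative and not from Apple's filing; the function names, the blend factor `alpha`, and the simulated readings are all assumptions for the example.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer tilt estimate.

    The gyro integrates smoothly but drifts over time; the accelerometer
    is noisy but drift-free. Blending the two yields a stable angle.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle (radians) from accelerometer x/z components."""
    return math.atan2(ax, az)

# Simulate a device held steady at a 10-degree tilt: the gyro reads
# zero rotation, so the filter slowly converges on the accelerometer angle.
angle = 0.0
true_tilt = math.radians(10)
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=true_tilt, dt=0.01)
```

Real devices fuse all three axes (and often a magnetometer) with more sophisticated filters, but the principle of combining a fast, drifting sensor with a slow, absolute one is the same.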

It appears that the new Apple technology combines those two advancing fields of development, Patently Apple reports.

"[C]urrent systems do not take into account the location and position of the device on which the virtual 3D environment is being rendered in addition to the location and position of the user of the device, as well as the physical and lighting properties of the user's environment, in order to render a more interesting and visually pleasing interactive virtual 3D environment on the device's display," the blog's Jack Purcher writes.

The patent application describes the use of sensors to determine the position of the device in conjunction with a front-facing camera. Associated software would track and calculate the position of the user's eyes relative to the device, and also gauge the lighting environment to better adjust the output for a potentially much more realistic-looking 3D interface.
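The core geometric idea, shifting on-screen layers according to where the viewer's eyes sit relative to the display, can be sketched in a few lines. This is a hypothetical illustration, not code from the patent: the function, its parameters, and the assumed eye-to-screen distance are all invented for the example.

```python
def parallax_offset(eye_pos, layer_depth, screen_distance=300.0):
    """Screen-space shift for a UI layer at a given virtual depth.

    eye_pos: (x, y) offset of the eyes from the screen's center, in mm,
             as a front-facing camera's face tracker might estimate it.
    layer_depth: virtual depth of the layer behind the screen plane, mm.
    screen_distance: assumed eye-to-screen viewing distance, mm.
    """
    ex, ey = eye_pos
    scale = layer_depth / (screen_distance + layer_depth)
    # A layer "behind" the glass appears to slide opposite the eye motion,
    # so deeper layers shift more and produce a depth illusion.
    return (-ex * scale, -ey * scale)

# Example: eyes 100 mm right of center, layer 100 mm "deep" in the scene.
dx, dy = parallax_offset((100.0, 0.0), layer_depth=100.0)
```

Adjusting rendering for ambient lighting, as the filing also describes, would layer on top of this kind of viewpoint-dependent geometry.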

The upshot is that we could someday have iPhones and iPads that can sense via instruments, ray-tracing, and software where they are relative to our eyes and provide us with such effects as object shines and shadows that move realistically in tandem with the movement of our eyes; in short, a 3D virtual reality that would leave the current stuff in the dust.

For a hint of what Apple's new technology could do, see the video below.