Displays in a future iPad or iPhone could use lasers to detect touch, with VCSEL sensors and other technologies potentially monitoring the deflection of the screen to determine pressure levels for 3D Touch-style interactions, or to provide an improved typing experience.

Current touchscreen technologies used in devices like the iPhone and iPad can use a wide array of techniques to detect when the user is touching the screen, and how much force they are applying. Existing methods include capacitive sensing, resistive sensing, ultrasonic sensing, and optical sensing.

With optical sensing in particular, the technology can be used to determine the deflection of the display from a finger press or pressure from a stylus. In a patent application published by the US Patent and Trademark Office on Thursday, Apple suggests it could improve upon existing methods.

In the filing titled "Self-Mixing Interference Based Sensors for Characterizing Touch Input," Apple proposes the use of vertical-cavity surface-emitting lasers (VCSELs) to monitor the display for touch inputs.

Each VCSEL emits a beam of "coherent light" toward the surface being monitored, which is reflected and mixed with other coherent light beams at a variety of angles into other sensors. The number of angles is important, because deflection of the display surface alters the angle of reflection, meaning the reception of specific light beams by other sensors can tell the system how much deflection there is.
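The geometry behind this can be sketched with a simple flat-mirror model. This is not the patent's method, just an illustration of the principle: a beam emitted at an angle to the surface normal, reflecting off a surface that has sagged by some depth, lands on the sensor plane shifted sideways by an amount proportional to that depth. All names and the 940nm-free formula below are illustrative assumptions.

```python
import math

def deflection_from_spot_shift(spot_shift_mm: float, beam_angle_deg: float) -> float:
    """Estimate surface deflection from the lateral shift of a reflected
    laser spot, using an idealized flat-mirror model: a beam hitting the
    surface at `beam_angle_deg` from the normal, when the surface moves
    a distance d farther away, returns to the sensor plane shifted
    laterally by 2 * d * tan(angle). Inverting that gives d."""
    return spot_shift_mm / (2.0 * math.tan(math.radians(beam_angle_deg)))

# A 0.2 mm spot shift at a 45-degree beam angle implies 0.1 mm of deflection.
print(deflection_from_spot_shift(0.2, 45.0))
```

The real system would involve many emitters and detectors at different angles, which is why the filing stresses the variety of angles: each emitter-detector pair constrains the deflection at a different point.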

The system can be improved further still by performing spectrum analysis on the measurements of multiple sensors to determine the speed and direction of movement. This can include reading multiple harmonic frequencies at different stages, from which movement can also be inferred.
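To make the spectrum-analysis idea concrete, here is a minimal sketch, not the patent's implementation: in self-mixing interferometry, a surface moving along the beam modulates the laser output at a Doppler beat frequency f = 2v/λ, so finding the dominant frequency in the sensor signal recovers the speed. The 940 nm wavelength and all function names are assumptions for illustration.

```python
import cmath
import math

WAVELENGTH_M = 940e-9  # assumed near-infrared VCSEL wavelength

def simulate_self_mixing(velocity_m_s: float, sample_rate: float, n: int) -> list:
    """Toy interference signal: a surface moving at constant velocity
    along the beam modulates power at the beat frequency f = 2v / wavelength."""
    beat_hz = 2.0 * velocity_m_s / WAVELENGTH_M
    return [math.cos(2 * math.pi * beat_hz * i / sample_rate) for i in range(n)]

def dominant_frequency(signal: list, sample_rate: float) -> float:
    """Naive DFT peak search over positive-frequency bins (the 'spectrum
    analysis' step; a real system would use an FFT)."""
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        coeff = sum(signal[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n

def estimated_velocity(signal: list, sample_rate: float) -> float:
    """Invert f = 2v / wavelength to recover the surface speed."""
    return dominant_frequency(signal, sample_rate) * WAVELENGTH_M / 2.0
```

A surface moving at 1 mm/s produces a beat of roughly 2.1 kHz at this wavelength, which the DFT peak recovers to within the frequency resolution of the window. Direction would come from additional cues, such as the asymmetry of the harmonics the filing mentions, which this sketch does not model.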

An example of VCSELs being used to determine a press and deflection of a surface

Measuring the deflection is important, as it tells the system how hard the user is pressing down on a specific point of the display. While this has immediate applications in areas like 3D Touch, which used the amount of pressure on a display to determine what level of options to provide to the user, there are other ways it can be employed.
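Turning a continuous deflection reading into discrete interaction levels could look something like the sketch below. The thresholds and level names are purely illustrative assumptions, not values from the patent; they simply echo 3D Touch's light-press/deep-press distinction.

```python
def touch_level(deflection_um: float) -> str:
    """Map a measured deflection (micrometres, hypothetical units) to a
    discrete input level, in the spirit of 3D Touch. Thresholds are
    illustrative, not from Apple's filing."""
    if deflection_um < 2.0:
        return "ignore"       # incidental contact
    if deflection_um < 10.0:
        return "tap"          # ordinary touch
    if deflection_um < 25.0:
        return "light_press"  # e.g. a preview-style action
    return "deep_press"       # e.g. a commit-style action

print(touch_level(15.0))  # a mid-range press
```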

Software keyboards on touchscreens currently detect touch events, be they distinct presses of specific areas of the display or swiping motions between keys, as used in some versions. By detecting the level of deflection, a touchscreen keyboard could accept a tapped input only above a certain level of pressure, while disregarding lighter touches.

In theory this could allow the user to gently rest their fingers on the keyboard, much as they would on a physical keyboard for touch typing, safe in the knowledge that their fingers won't trigger spurious key presses until they intentionally type and pass a pressure-based equivalent of a mechanical keyboard's "actuation" point.
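The actuation-point behavior described above can be sketched as a small state machine, a hypothetical illustration rather than anything from the filing. A key fires once when deflection crosses an actuation threshold, then re-arms only after deflection falls below a lower reset threshold, mimicking the hysteresis of a mechanical switch so resting fingers never trigger keys. The threshold values are assumed for the example.

```python
ACTUATE_UM = 12.0  # hypothetical actuation deflection, micrometres
RESET_UM = 6.0     # hysteresis: must drop below this before re-arming

class PressureKey:
    """Fires once when deflection crosses the actuation point, then
    re-arms only after deflection drops below the reset point."""

    def __init__(self):
        self.armed = True

    def update(self, deflection_um: float) -> bool:
        """Returns True only on the sample where the key actuates."""
        if self.armed and deflection_um >= ACTUATE_UM:
            self.armed = False
            return True   # key press registered
        if not self.armed and deflection_um <= RESET_UM:
            self.armed = True  # key released past the reset point
        return False

# Resting fingers (light deflection) register nothing; two deliberate
# presses, separated by a release, register exactly twice.
key = PressureKey()
events = [key.update(d) for d in [3, 4, 5, 14, 13, 2, 15]]
print(events.count(True))
```

The gap between the actuation and reset thresholds is what prevents a finger hovering right at the actuation depth from chattering out repeated key presses.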

Apple files numerous patent applications on a weekly basis, and while the filings do not necessarily mean the concepts will appear in a future product or service, they do indicate areas of interest for Apple's research and development efforts.

VCSELs are already an important area of interest for Apple, with the technology used to power the iPhone's TrueDepth camera array, as well as showing promise in self-driving vehicle applications under "Project Titan."