Apple has filed a series of Mixed Reality Head Mounted Display system patent applications in 2019 that use a specific style of graphics (01, 02, 03, 04), and today the US Patent & Trademark Office published Apple's fifth such patent application, this one focusing on a display system that uses a variety of sensors. A controller renders virtual content based in part on the inputs from those sensors, and the HMD displays the resulting frames to provide the user with a 3D virtual view that includes both the virtual content and a view of the user's environment.

Virtual reality (VR) allows users to experience and/or interact with an immersive artificial environment, such that the user feels as if they were physically in that environment. For example, virtual reality systems may display stereoscopic scenes to users in order to create an illusion of depth, and a computer may adjust the scene content in real-time to provide the illusion of the user moving within the scene. When the user views images through a virtual reality system, the user may thus feel as if they are moving within the scenes from a first-person point of view. Similarly, mixed reality (MR) combines computer generated information (referred to as virtual content) with real world images or a real world view to augment, or add content to, a user's view of the world. The simulated environments of virtual reality and/or the augmented environments of mixed reality may thus be utilized to provide an interactive user experience for multiple applications, such as adding virtual content to a real-time view of the viewer's environment, interacting with virtual training environments, gaming, remotely controlling drones or other mechanical systems, viewing digital media content, interacting with the Internet, and the like.

Apple's invention covers a mixed reality system that may include a mixed reality device such as a headset, helmet, goggles, or glasses (referred to herein as a head-mounted display (HMD)) that includes a projector mechanism for projecting or displaying frames including left and right images to a user's eyes to thus provide 3D virtual views to the user.

The 3D virtual views may include views of the user's environment augmented with virtual content (e.g., virtual objects, virtual tags, etc.). The mixed reality system may include world-facing sensors that collect information about the user's environment (e.g., video, depth information, lighting information, etc.), and user-facing sensors that collect information about the user (e.g., the user's expressions, eye movement, hand gestures, etc.).

The sensors provide the information as inputs to a controller of the mixed reality system. The controller may render frames including virtual content based at least in part on the inputs from the world and user sensors. The controller may be integrated in the HMD, or alternatively may be implemented at least in part by a device external to the HMD. The HMD may display the frames generated by the controller to provide a 3D virtual view including the virtual content and a view of the user's environment for viewing by the user.

In some embodiments, the sensors may include one or more cameras that capture high-quality views of the user's environment that may be used to provide the user with a virtual view of their real environment. In some embodiments, the sensors may include one or more sensors that capture depth or range information for the user's environment. In some embodiments, the sensors may include one or more sensors that may capture information about the user's position, orientation, and motion in the environment.

In some embodiments, the sensors may include one or more cameras that capture lighting information (e.g., direction, color, intensity) in the user's environment that may, for example, be used in rendering (e.g., coloring and/or lighting) content in the virtual view.

In some embodiments, the sensors may include one or more sensors that track position and movement of the user's eyes.

In some embodiments, the sensors may include one or more sensors that track position, movement, and gestures of the user's hands, fingers, and/or arms.

In some embodiments, the sensors may include one or more sensors that track expressions of the user's eyebrows/forehead.

In some embodiments, the sensors may include one or more sensors that track expressions of the user's mouth/jaw.

Apple's patent FIG. 1 below illustrates a mixed reality system #10 that may include a HMD #100 such as a headset, helmet, goggles, or glasses that may be worn by a user. In some embodiments, virtual content #110 may be displayed to the user in a 3D virtual view #102 via the HMD; different virtual objects may be displayed at different depths in the virtual space. In some embodiments, the virtual content may be overlaid on or composited into a view of the user's environment, with respect to the user's current line of sight, that is provided by the HMD.

The HMD may implement any of various types of virtual reality projection technologies. For example, the HMD may be a near-eye VR system that projects left and right images on screens in front of the user's eyes, such as DLP (digital light processing), LCD (liquid crystal display), and LCoS (liquid crystal on silicon) technology VR systems.

As another example, the HMD may be a direct retinal projector system that scans left and right images, pixel by pixel, to the subject's eyes. To scan the images, left and right projectors generate beams that are directed to left and right reflective components (e.g., ellipsoid mirrors) located in front of the user's eyes; the reflective components reflect the beams to the user's eyes. To create a three-dimensional (3D) effect, virtual content at different depths or distances in the 3D virtual view is shifted left or right in the two images as a function of the triangulation of distance, with nearer objects shifted more than more distant objects.
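The depth-dependent left/right shift described above can be sketched with standard pinhole-stereo geometry. This is an illustrative assumption on our part; the patent gives no formulas, and the baseline and focal-length values below are hypothetical, not from Apple:

```python
# Sketch of the depth-dependent left/right image shift described above.
# Standard stereo triangulation is assumed: horizontal disparity is
# inversely proportional to distance, so nearer objects shift more.
# The default parameter values are illustrative only.

def disparity_px(depth_m, baseline_m=0.063, focal_px=1400.0):
    """Horizontal pixel shift between the left and right images for a
    point at depth_m meters, given an eye baseline (IPD) in meters and
    a focal length expressed in pixels."""
    return baseline_m * focal_px / depth_m

for d in (0.5, 1.0, 2.0, 4.0):
    print(f"depth {d:>4} m -> disparity {disparity_px(d):6.1f} px")
```

Note how halving the distance doubles the shift, which is what produces the stronger stereo cue for nearby virtual objects.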

Apple's HMD may include world sensors #140 that collect information about the user's environment (video, depth information, lighting information, etc.), and user sensors #150 that collect information about the user (e.g., the user's expressions, eye movement, hand gestures, etc.).

The world and user sensors provide the collected information as inputs to a controller of the mixed reality system. The controller may render frames for display by a projector component of the HMD that include virtual content based at least in part on the various information obtained from the multiple sensors.
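The sensor-to-controller flow described above can be sketched in code. Every name and field here is hypothetical; the patent describes the categories of data but specifies no API:

```python
# Hypothetical sketch of the world/user sensor -> controller -> frame flow
# described above. All class and field names are illustrative assumptions;
# the patent application does not define a concrete interface.

from dataclasses import dataclass

@dataclass
class WorldSample:
    video_frame: object   # camera view of the user's environment
    depth_map: object     # depth/range information
    lighting: tuple       # (direction, color, intensity) estimate

@dataclass
class UserSample:
    gaze: tuple           # eye position and movement
    hand_pose: object     # hand/finger/arm tracking
    expression: dict      # eyebrow/forehead and mouth/jaw tracking

def render_frame(world: WorldSample, user: UserSample) -> dict:
    """Compose one display frame: the environment view augmented with
    virtual content placed using world depth, colored/lit using the
    lighting estimate, and responsive to the user's gaze and gestures."""
    return {
        "background": world.video_frame,
        "virtual_content": {
            "placed_with": world.depth_map,
            "lit_with": world.lighting,
            "focused_at": user.gaze,
            "reacts_to": user.hand_pose,
        },
    }
```

The point of the sketch is the split the patent emphasizes: world sensors drive where and how virtual content is drawn, while user sensors drive how it responds to the wearer.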

While not shown in FIG. 1, in some embodiments the mixed reality (MR) system may include one or more other components. For example, the system may include a cursor control device (something like a mouse) for moving a virtual cursor in the 3D virtual view to interact with virtual content.

As another example, in some embodiments, the MR system may include a computing device coupled to the HMD via a wired or wireless (e.g., Bluetooth) connection that implements at least some of the functionality of the HMD, for example rendering images and image content to be displayed in the 3D virtual view by the HMD.

Apple's patent FIGS. 2A through 2C below illustrate world-facing and user-facing sensors of an example HMD. Specifically, FIG. 2A shows a side view of an example HMD with world and user sensors 210-217.

Apple's patent FIG. 2B above shows a front (world-facing) view of an example HMD with world and user sensors 210-217; FIG. 2C shows a rear (user-facing) view of an example HMD with world and user sensors 210-217.

Apple's patent FIG. 4 below is a block diagram illustrating components of an example mixed reality system. In some embodiments, a mixed reality system 1900 may include a HMD 2000 such as a headset, helmet, goggles, or glasses. HMD 2000 may implement any of various types of virtual reality projector technologies. For example, the HMD 2000 may include a near-eye VR projector that projects frames including left and right images on screens that are viewed by a user, such as DLP (digital light processing), LCD (liquid crystal display) and LCoS (liquid crystal on silicon) technology projectors. As another example, the HMD 2000 may include a direct retinal projector that scans frames including left and right images, pixel by pixel, directly to the user's eyes. To create a three-dimensional (3D) effect in 3D virtual view 2002, objects at different depths or distances in the two images are shifted left or right as a function of the triangulation of distance, with nearer objects shifted more than more distant objects.

Apple's patent application, which was published today by the U.S. Patent Office, was originally filed in Q1 2019, though some of the work dates back to 2016. Apple has filed 20 patents relating to a mixed reality headset and smartglasses in 2019 alone. You can review our archives here. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

Apple's Inventors

Ricardo Motta: Distinguished Engineer – DEST. Came to Apple via NVIDIA

Brett Miller: Engineering Manager, Camera Incubation. Came to Apple via Intel's Perceptual Computing Lab

Tobias Rick: Video Engineering. Came to Apple via Samsung Next Experience Display Lab

Manohar Srikanth: Camera & Imaging Technologist. Came to Apple via Nokia as Senior Researcher
