When thinking about Augmented Reality, we tend to think of live imagery being added to our iPhone photos or videos, or of imagery that could be added to future smartglasses, as Apple presents on its AR-related webpage. But Apple engineers have come up with a way to help users with a specific vision problem, namely a permanent blind spot in the field of vision, overcome their impairment when taking photos with a future iPhone.

Apple's invention isn't yet for general use, such as in a pair of prescription lenses. For the time being, the invention and its application are limited to helping users with retina problems take photos with an iPhone without their blind spot being a handicap any longer. That would be quite an achievement if Apple could bring this to market, as it could assist millions of users worldwide with this common handicap.

Yesterday Apple was granted patent No. 10,347,050, which covers the field of digital image processing, and more specifically the field of warping images in an augmented reality device.

Many types of visual impairment may result in partial loss of sight, or weakened sight. Visual impairments may be the result of any number of health issues, such as diabetes, old age, retina issues, and the like.

As an example, some people develop floaters or blind spots in their eye which cause obstructions in a person's visual field. The result is that people often have to compensate for blind spots and other obstructions and impairments by viewing only portions of a scene at a time. Thus, the full field of view is never made available at once.

Apple's granted patent covers systems, methods, and computer readable media for image warping. In general, techniques are disclosed to provide a warped image of a real environment. According to one or more embodiments, providing a warped image may give a person with partially impaired or occluded vision the ability to see a full field of view by warping the portion of the field of view occluded by an abnormality in the eye.

According to one or more embodiments, a camera may capture an image of a view of a real environment. A portion of the view of the real environment may be determined to be occluded to a user. For example, a user may have an optical obstruction, such as a blind spot or a floater that makes a portion of the field of view occluded.

As an example, a portion of the real environment may appear blurry, or not appear at all, based on the optical obstruction. A warping function may be applied to the image surrounding the obstructed portion.

In one or more embodiments, the warping function may warp the area surrounding the obstruction such that the real environment appears warped around the obstruction. Thus, when the warped area is displayed, the user can view the full field of view regardless of the optical obstruction.
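The patent describes the warp only at this conceptual level, so the following is a minimal illustrative sketch, not Apple's actual method: one simple way to "warp around" a circular blind spot is to radially compress everything inside an outer radius into the ring between the blind-spot edge and that outer radius, so the hidden pixels reappear just outside the obstruction. The function name, parameters, and the choice of a circular region with nearest-neighbor sampling are all assumptions for illustration.

```python
import numpy as np

def warp_around_blind_spot(img, center, r0, r1):
    """Illustrative sketch (not Apple's implementation): compress the
    image content inside radius r1 into the ring (r0, r1] around a
    circular blind spot of radius r0 centered at `center`, so pixels
    the user cannot see reappear just outside the obstruction.

    img: 2D numpy array (grayscale); center: (row, col).
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dy, dx = ys - center[0], xs - center[1]
    r = np.hypot(dy, dx)
    out = img.copy()
    ring = (r > r0) & (r <= r1)
    # Map each output radius in (r0, r1] back to a source radius in (0, r1],
    # squeezing the full disk of content into the visible ring.
    r_src = (r[ring] - r0) / (r1 - r0) * r1
    scale = np.where(r[ring] > 0, r_src / r[ring], 0.0)
    # Nearest-neighbor sampling from the computed source positions.
    sy = np.clip((center[0] + dy[ring] * scale).round().astype(int), 0, h - 1)
    sx = np.clip((center[1] + dx[ring] * scale).round().astype(int), 0, w - 1)
    out[ring] = img[sy, sx]
    return out
```

Content outside the ring is left untouched; only the region the user can actually see is altered, which matches the patent's idea that the environment "appears warped around the obstruction."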

Apple's patent FIG. 1 below illustrates an overview block diagram of a simplified electronic device with a warping module; FIG. 2 shows, in flow chart form, an example method for augmenting an image of a real environment; FIG. 4 shows, in system diagram form, an example setup of using a device to warp images of a real-world environment; FIG. 5 shows an example system diagram of an augmented reality device warping an image to allow a user with partially impaired vision to see around a blind spot.

The user's blind spot, as noted below, partially covers the number 5 and totally blocks the number 6. With the camera's AR warping feature, the blocked image content is warped to the area just outside the blind spot, so the user will be able to clearly see the numbers 5 and 6 that were blocked.

So how does this work? According to Apple, an iPhone (iDevice) camera would include a calibration module that performs a check to determine the location of the blind spot, floater, or the like, in a user's eye.

Identifying an optical obstruction may be done in any number of ways. For example, the calibration module may display a test image on the iPhone display and request and/or receive feedback from the user indicating whether the test image is visible at certain positions, so that future images captured by the front-facing camera appear correctly on the display. Sensors are used to assist in locating the area of a user's blind spot.
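The patent leaves the calibration details open, so here is a hedged sketch of one way such feedback could be turned into a blind-spot estimate: show test dots at known display positions, record which ones the user reports as invisible, and fit a simple center-and-radius model to the invisible positions. The function name, the response format, and the circular model are assumptions for illustration, not taken from the patent.

```python
import math

def estimate_blind_spot(responses):
    """Illustrative sketch (not Apple's implementation): estimate a
    circular blind spot from calibration feedback.

    responses: list of ((x, y), visible) pairs, one per test-dot
    position, where visible is False if the user reported not seeing
    the dot there. Returns ((cx, cy), radius), or None if every
    position was visible.
    """
    hidden = [pos for pos, visible in responses if not visible]
    if not hidden:
        return None
    # Center: mean of the positions the user could not see.
    cx = sum(x for x, _ in hidden) / len(hidden)
    cy = sum(y for _, y in hidden) / len(hidden)
    # Radius: distance from the center to the farthest hidden position.
    radius = max(math.hypot(x - cx, y - cy) for x, y in hidden)
    return (cx, cy), radius
```

The estimated center and radius could then feed directly into a warping step like the one described above, defining where the surrounding image content needs to be remapped.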

Apple notes that the camera's tracking module may track the object in part by determining a depth of the object in relation to one or more of the cameras capturing images of the object. That is, the calibration module may implement eye tracking and/or gaze detection technology.

In one or more embodiments, the pose of a user's eye and the relationship of the eye to the iPhone may be determined using the back-facing camera to capture an image of the user's eye.

The camera capturing the image of the user's eye may be a traditional camera, including a lens stack and sensor, or may be a lenseless camera. In one or more embodiments, the calibration module may additionally, or alternatively, utilize other sensor data from sensors capable of tracking movement and position of a user's eye.

Apple's granted patent 10,347,050, first discovered by Patently Apple, was originally filed in Q3 2017 and published yesterday by the USPTO. You could dive into this further, here.

The Inventors

Ray Chang: Sr. Systems Hardware Engineering Manager. Some of his teams' accomplishments include the Apple Pencil, Force Touch trackpad, Siri Remote and more.

Paul Wang: Architect, Product Design. In his 6 years with Apple he's held the positions of Mac Product Designer, Sr. Input Technologist, Manager of Product Design and now Architect, Product Design.
