Magic Leap Review Part 1 – The Terrible View Through Diffraction Gratings

I have been studying the Magic Leap One (ML1) for a few weeks and have taken over 1,000 photos and made many measurements. I’m bursting with things I want to talk about, so I am planning to release a series of articles. For this first article, I am primarily going to cover the view of the real world through the ML1.

The ML1 uses diffraction waveguides, which in turn means you must look at the real world through a diffraction grating. No amount of money and effort can change the laws of physics ("Moore's Law" does not apply to optics). Looking through a diffraction grating distorts the view of the real world, as this article will demonstrate. One must ask, "What good is Augmented Reality if it is ruining the view of the real world?"

As I started to write up my findings, I learned that Magic Leap was making a serious effort to market diffractive waveguides to the U.S. military for use by soldiers in the field (see disclaimer at the end). Until I heard about this effort, I had no idea that Magic Leap would have the audacity to push technology so ill-suited for outdoor military use.

A Little on Receiving and Setting Up My ML1

Before I go into the issues with the ML1, let me say a few good things about the ML1. First, I had a problem with my American Express Card denying the transaction due to suspected fraud (maybe someone at Amex had been reading my blog, just kidding). Magic Leap’s customer service via email helped me fix this issue.

Magic Leap is using the delivery and set-up service Enjoy. The service technician was helpful and spent about an hour getting me set up. But as others have pointed out, the use of a delivery and setup service is also an indication that the ML1 is not nearly ready to be a consumer product. Also, because it requires custom fitting and you can't wear regular glasses with it (unlike Hololens), sharing the headset with others is highly problematic.

Unlike other reports, I was not given the shoulder strap carrier (what some are calling “The Magic Leap purse”). I don’t know if this was an availability issue or if the Enjoy representative forgot to give it to me.

The color balance in the center of the view looks good, but the colors shift as you move away from the center of the image (more on this in an upcoming article). But even compared to a cheap LCD monitor, the image quality regarding uniformity, sharpness/resolution, and contrast is not very good.

The demos, particularly the Construct applications, were fun to play with and captivating. Seeing objects moving around in a 3-D world has a visceral impact, as it did with Hololens, but there is more to do out of the box with the ML1. The FOV being wider than Hololens' makes for a better first experience.

The hand controller with the ML1 is much better than the use of gestures with Hololens which quickly tire one’s arm. Still, it is a pain to do even simple tasks with both the ML1 controller and Hololens’ gesture inputs. There is a lack of consistency in the way you control the ML1 from application to application, particularly when it comes to closing an application or window (it seems like every application has a different closing method).

Some of the Major Problems

I don't have time right now to go into all the details, but I would like to briefly list some of the issues I have found to date:

- It blocks 85% of the real-world light (as measured, and as predicted in my analysis of the Shaq video back in February 2018).
  - Blocking so much light helps to hide or reduce other optical problems.
  - Hololens blocks about 60% and thus lets through 2.67X (40%/15%) more light than the ML1.
- The ML1 emits only ~210 cd/m² (commonly referred to as "nits"), compared to Hololens, which outputs ~320 nits.
- Even with all the computer hardware moved to the "Lightpack," the headset gets very hot, so much so that the Enjoy agent advised me to be careful about having the hottest parts touch my head.
- Looking through a diffraction grating blurs and distorts the real world (the subject of this article).
- The diffractive waveguides soften/blur the virtual image (the image does not make it through the waveguide intact).
  - The effective resolution of the ML1 is about half that of the 1280×960 pixels Magic Leap has claimed.
  - The ML1 displayed image is noticeably blurrier than Hololens' (a subject for the next post).
- Diffraction waveguides have inherent color issues:
  - Color ripples, (quoting Ryan Smith, CEO of Invrse) "like looking through soap bubbles."
  - Colors shift across the FOV, particularly in about the outer 15% of the left and right sides of the FOV.
- The dual focus planes for vergence-accommodation conflict (VAC) are ridiculous (this will have to be a whole article).
  - There are only two, they show only one at a time, and the user sees the image jump when it changes.
  - People claiming that it "must be working because you don't notice it" are victims of marketing hype. As Palmer Luckey, founder of Oculus, wrote, "Mismatch occurs at all other depths. In much the same way, a broken clock displays the correct time twice a day."
  - The "near focus plane" kicks in from the clipping point of 14.5 inches to about 30 inches and is focused at ~20 inches (50.8 cm).
  - The "far focus plane" covers anything beyond 30 inches (there is some hysteresis as you move in and out) and appears to focus at (only) ~60 inches (1.52 m).
  - Magic Leap appears to have doubled down on the psychobabble in their blog article "Spatial Computing: An Overview for Our Techie Friends," which is nauseating in more ways than one, IMO.
  - Magic Leap did add one other optical trick that they call "sub-pupils," but it is not clear how well it works. For reference, sub-pupil gratings are described in Magic Leap applications US20160327789 and US20180052277 and seen in Step 10 of the iFixit Teardown.
- Problem with binocular overlap: with an image that fills the FOV, there are dark bands at the left and right sides (the size of the bands depends on the eyes' vergence). Eliminating this problem would mean reducing the FOV.
- It uses field sequential color (FSC), but with at least twice the sequence rate of Hololens, so the FSC breakup is much less noticeable; it is still an issue for some applications.
- There is very little eye relief, and it requires a separate purchase of special corrective lenses.
- The Simultaneous Localization and Mapping (SLAM) technology is just so-so. It requires just the right amount of lighting and is blind to anything dark in color. The mapping is not particularly accurate and seems to drift. I'm told by others who have much more experience in this area that it is inferior to the Hololens' SLAM.
- The cabling is a definite snag hazard and in your way. One of the biggest dangers is that you will take off the headset, place it on the table, and forget that the computer unit is still attached to your pocket or shoulder strap; as you turn around or walk away, you will drag the headset off the table.
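As a side note, the two-plane switching behavior described above can be sketched as a simple state machine with hysteresis. This is a toy model for illustration only: the 14.5-inch clipping point and ~30-inch switch distance come from my measurements, but the width of the hysteresis band is an assumption, not a published figure.

```python
NEAR_CLIP_IN = 14.5    # clipping point, in inches (from measurements above)
SWITCH_IN = 30.0       # approximate near/far switch distance (from above)
HYSTERESIS_IN = 2.0    # assumed band width; the real value is not published

def select_focus_plane(distance_in, current="far"):
    """Pick which of the two focus planes to show for content at
    distance_in inches. Hysteresis keeps the plane from flickering when
    content hovers near the switch distance; only one plane is shown at
    a time, which is why the user sees the image jump on a switch."""
    if distance_in < NEAR_CLIP_IN:
        return None  # content closer than the clipping point is not drawn
    if current == "near":
        # stay on the near plane until well past the switch distance
        return "far" if distance_in > SWITCH_IN + HYSTERESIS_IN else "near"
    # currently on the far plane: switch back only when well inside it
    return "near" if distance_in < SWITCH_IN - HYSTERESIS_IN else "far"

print(select_focus_plane(20.0, "far"))   # near
print(select_focus_plane(29.0, "far"))   # far (hysteresis holds it)
print(select_focus_plane(29.0, "near"))  # near
```

Note how 29 inches maps to either plane depending on which one is currently displayed; that is the hysteresis the bullet above refers to.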

Dim and Fuzzy View of the Real World

Starting at the beginning of my evaluation, I took a picture looking through the ML1 with the unit turned off. The first thing that is obvious is that the ML1 blocks most of the real-world light and blocks much of a person's peripheral vision. Back in February 2018, based on the video with Shaq, I was able to estimate that the ML1 was blocking about 85% of the light. This estimate agrees with instrument measurements showing that it blocks about 83% of the light at the top and 86% at the bottom of the waveguide. You will also notice in the picture on the left the effects of looking through a diffraction grating.

By way of comparison, Microsoft's Hololens blocks ~60% of the light, which is itself significant. Still, Hololens lets through ~2.67 times more real-world light than the ML1.
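The 2.67× figure is simply the ratio of the transmitted fractions, which a few lines make explicit:

```python
ml1_blocked = 0.85       # ML1 blocks ~85% of real-world light (measured)
hololens_blocked = 0.60  # Hololens blocks ~60%

ml1_through = 1.0 - ml1_blocked            # ~15% transmitted
hololens_through = 1.0 - hololens_blocked  # ~40% transmitted

ratio = hololens_through / ml1_through     # how much more light Hololens passes
print(round(ratio, 2))  # -> 2.67
```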

Quick Background on Diffraction Waveguides

Diffraction waveguides have been discussed on this blog many times, including articles on Hololens and Magic Leap. A series of lines spaced apart close to the wavelength of light, a diffraction grating, will bend light like a prism. But unlike a prism, a grating will bend the light into a series of "orders." With a diffractive waveguide, only the light from one of these orders is used; the rest of the light is not only wasted but can reduce the contrast of the overall system as it bounces around in the optics. The angle of each order into which the grating bends the light is a function of the wavelength (color) of the light, the grating spacing, and the angle at which light hits the grating.
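To see the wavelength dependence concretely, the standard grating equation, d·(sin θm − sin θi) = m·λ, can be evaluated directly. The pitch used below is illustrative only (Magic Leap does not publish its grating pitches); the point is that longer (red) wavelengths are bent more than shorter (blue) ones, which is the prism-like color spreading described above.

```python
import math

def diffraction_angle_deg(wavelength_nm, pitch_nm, incidence_deg=0.0, order=1):
    """Solve the grating equation d*(sin(theta_m) - sin(theta_i)) = m*lambda
    for the diffracted angle theta_m (in degrees). Returns None when the
    requested order is evanescent (does not propagate)."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / pitch_nm
    if abs(s) > 1.0:
        return None  # no propagating diffracted order for this geometry
    return math.degrees(math.asin(s))

pitch = 1000.0  # nm; illustrative value, not the ML1's actual grating pitch
for name, wl in [("blue", 450.0), ("green", 530.0), ("red", 620.0)]:
    print(f"{name}: {diffraction_angle_deg(wl, pitch):.1f} deg")
```

Running this shows each color emerging at a different first-order angle, blue least and red most, which is why a white light source smears into a rainbow when seen through the exit grating.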

Key to today’s topic is that with a diffractive waveguide you are looking at the real world through a diffraction grating, what is known as the “exit grating and pupil expander.” In the case of a diffraction waveguide, the light is traveling through the glass with total internal reflection at an angle of about 45 degrees (the exact angle depends on many factors). The exit grating is designed to bend the light from ~45 degrees to 90 degrees so it will exit the waveguide’s glass toward your eye.

But light from the real world passes through the diffraction grating from the opposite side, with all manner of wavelengths and directions, causing this light to be bent at various angles. The exit grating will both diffract/defocus real-world light that would otherwise have gone straight into the eye and cause prism-like color artifacts. In the cases of the ML1 and Hololens, the exit gratings run horizontally and thus are susceptible to "capturing" light from above, such as overhead lights.

Worse yet for the ML1, you are looking through six (6) diffraction gratings. While each grating layer is designed to bend the light of a specific color, the wavelengths of visible light are close enough together that the gratings affect all visible light.

Real World Viewed Through ML1 and Hololens Exit Gratings

The pictures below were taken through an ML1 (left) and Hololens (right) "exit grating" with the units off. I adjusted the exposure to compensate for the ML1 letting through ~2.7× less light. In the two pictures below, you can see how the diffractive waveguides break the light from the lamp above into a rainbow of colors in the middle of the pictures. Additionally, in the ML1 picture on the left, you should notice the fuzzy, flame-like double image just above the light fixture's bulbs. The flare is caused by light from the lamp going through the six layers of diffraction gratings roughly perpendicular to the waveguide. While this effect is barely noticeable with the Hololens (it is still there), it is very pronounced with the ML1.

The ML1 shows a less noticeable effect from captured light when the light source is within the FOV (see above) compared to Hololens, but the ML1 has a much bigger problem than Hololens when the light source is above the FOV, as demonstrated in the two pictures below. With the ML1, I regularly see "flares" of colored light in the bottom of the view when there is an overhead light source. When viewed through both eyes (as opposed to one lens in the pictures), you will see double the "flares." BTW, in the pictures below, the Hololens headset is visible in the picture taken through the ML1 and vice versa.

Remember also that Magic Leap is blocking 85% and Hololens is blocking 60% of the real world's light, so these diffraction effects would be much brighter if the headsets did not have such dark lenses.

As another point of reference, I took a picture of part of a test pattern on an iPhone 6s Plus (on the left). Once again, I adjusted the exposure to compensate for the ML1 blocking 85% of the light. You should notice the blurry scattering of light that softens the image.

Mixing Real and Virtual Images

On the right is a picture of the Dr. G's Invaders teaser included with the ML1, showing a light in the background. Not only does the real-world light have a fuzzy double image, but so too does the virtual image. The text is decidedly "soft"/blurry, which I will cover in more detail in a future article. Supporting two focus planes doubles the number of diffraction gratings, which contributes to the problems with both the real-world and the virtual image views; the image quality is degraded by passing through the many diffraction gratings.

Additionally, you might notice the colors in what should be solid white text. The colors change as you move your head and eyes, what Ryan Smith, CEO of Invrse, described as "like looking through soap bubbles." There are other color issues, including chromatic aberration (the colors do not align with each other).

Conclusion – You Don’t Want to See the Real World Through a Diffraction Grating

Simply put, you don’t want to look at the real world through a diffraction grating, but that is what you are required to do with diffractive waveguides. No amount of money or effort is going to change the physics of diffraction gratings.

The ML1 is noticeably worse than Hololens in blurring one's view and causing color artifacts from light sources in the real world, but it is a contest between bad and worse. The adage "there is no right way to do the wrong thing" applies. At least part of what makes the ML1 worse is the support for two focus planes, which requires you to look through double the number of waveguides.

The diffraction grating "rainbow flare" effects and the darkening of the real world are the most obvious problems. But one should also consider that these waveguides are very optically inefficient, with only a small percentage of the light making it to the eye. Thus, the ML1 headset gets very hot while producing only about 210 nits.

What good is it to augment the world when you are ruining the view of the real world?

Next Time – Images Captured from the ML1

In the next article in this series, I plan on delving deeper into image quality (or lack thereof) with the ML1.

Disclaimer

It was recently reported in Bloomberg News that Magic Leap is going after a $500M military contract for use with troops in the field (as opposed to training). I recently joined RAVN, which is developing technology that supports field-use military applications that utilize head-worn displays. Knowing that Magic Leap was using diffractive waveguides, with the many inherent physics issues of looking through a diffraction grating, among other severe problems for military use (only some of which are included in this article), I never considered Magic Leap a direct competitor to RAVN. I welcome anyone to present any evidence to the contrary.

I find it incredible (I’m struggling for a polite way to say this) that Magic Leap has a significant marketing effort pushing Magic Leap for military field (outdoor) use. I would never want to subject our troops to looking through diffraction gratings in the course of their duty. So even though I still consider it silly on a technical basis to consider them a competitor, they are certainly finding the same market to be attractive as a business.

Please also note that this blog reflects my personal opinions and not those of RAVN.

Acknowledgment

I would like to thank Ron Padzensky for reviewing and making corrections to this article.
