By Arjun Jasyal

This article isn’t about Augmented Reality itself; it’s about the devices that enable AR.

Augmented reality is about placing computer-generated digital objects in the viewer’s reality so that the complete experience appears true to life.

But an AR device is much more than that. Many users can’t see the full capabilities of these devices and are often biased towards what an AR device is intended to do. Today I’ll try to pull back this curtain of bias, which neglects other important features of an AR device.

First of all, what bias am I talking about? It is about how an AR device is judged: usually only on how the end-user AR experience looks on the device or, to be precise, on “what the user SEES through it”.

There is no harm in rating an AR device on how good the experience looks through it, but when that is the only criterion, the rating is unfair and neglects other features of the device that are much more important.

There are many devices on the market with different form factors. These AR devices are broadly categorised either as smartphone AR, like ARKit on the iPhone X, or head-mounted devices, like Microsoft’s HoloLens.

Both kinds of device do their job of putting digital objects in the viewer’s reality, but the experiences they deliver are quite different. HoloLens uses waveguides to place digital objects directly in the user’s point of view. ARKit on the iPhone does it by compositing digital objects into the video feed from its camera. Both devices are great and serve their intended purposes.

One of the shortcomings of HoloLens is its field of view. On the iPhone X, the FOV equals the FOV of its camera, which is much bigger than the HoloLens’s. Many users make this the deciding point when they debate which device’s AR experience is better.

Even though ARKit is limited in how well it can map the environment around the user, many users still find ARKit a better experience than HoloLens, only because the iPhone has a much bigger FOV.

This holds true in some applications only. But when you look beyond them and explore other capabilities and applications of these devices, the experience becomes terrible on the iPhone X and much better on HoloLens. This is when you realise that an AR device is not about what you see through it, but about what the device itself is capable of seeing through its sensors!

One example I can give is dimension-measuring applications. Both the HoloLens and ARKit platforms have measuring applications made by indie developers. Here’s a video comparing the measurement accuracy of the two platforms.

https://youtu.be/turasRyb5nY

As you can see in the video, HoloLens was off target by just 1 or 2 cm, while the iPhone was far off target and could be called useless even for approximate measurements.
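Under the hood, measuring apps on both platforms work the same way: the device estimates two anchor points in 3D world space and computes the straight-line distance between them, so the accuracy depends entirely on how well the sensors track those points. Here is a minimal sketch of that calculation in Python (the function and the sample points are illustrative, not from either SDK):

```python
import math

def distance_between_anchors(p1, p2):
    """Euclidean distance between two 3D world-space points (in metres)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Two hypothetical anchor points placed at the ends of a 1 m table edge.
# A well-tracked device reports positions close to the true geometry;
# drift in the sensor estimates shows up directly as measurement error.
start = (0.0, 0.0, 0.0)
end = (1.0, 0.0, 0.02)  # slight tracking error on the z axis
print(round(distance_between_anchors(start, end), 3))
```

The maths is trivial; the hard part, and the part the video is really comparing, is how accurately each device’s sensors place those two anchors in the first place.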

I am not trying to make iPhone AR look inferior to HoloLens AR; I’m trying to make the point that a better AR experience doesn’t come from a big FOV but from better tracking by the device’s sensors.

A better AR device is not about a bigger FOV or realistic graphics, but about how well the device can understand the environment around it. It is about how much input it can take from the user, making the experience better not only through realistic visuals but through realistic interactions. One example I can give here is this developer controlling a quadcopter with hand gestures using a HoloLens.

http://www.thejumperwire.com/articles/controlling-a-quadcopter-with-hand-gestures/

This application makes an AR device much more than just a device for seeing digital objects in the real world. An AR device with as many sensors as the HoloLens is capable of controlling other devices around us.

An AR device is always much more about the input it can take than the output it produces. The first job of an AR device is to understand the environment around it. Once that is done, placing digital objects in the real world can be handled by any computer on the planet, and the output can be shown on any display you may have.

One of the first mainstream devices with the capability to understand the environment around it was the optical mouse, which is still present on almost every desktop in the world. An optical mouse knows its position on a 2D plane and so, by the argument of this article, qualifies as an AR device.
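The mouse illustrates the pattern in its simplest form: a sensor reports relative motion, and the device integrates those deltas into a position on a 2D plane. A toy sketch of that integration in Python (the motion reports here are made up for illustration):

```python
def integrate_deltas(deltas, start=(0, 0)):
    """Accumulate (dx, dy) motion reports into absolute 2D positions,
    the way an optical mouse's driver tracks the cursor."""
    x, y = start
    path = [start]
    for dx, dy in deltas:
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

# Hypothetical sensor reports: two moves right, then one move up.
print(integrate_deltas([(5, 0), (3, 0), (0, -2)])[-1])  # final position (8, -2)
```

An AR headset does the same thing in 3D with far richer sensors, but the principle is identical: understand motion and position first, render output second.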

We have neglected this aspect of our devices, the way they understand our environment, and have always given foremost attention to the end-user experience and the output our devices produce. But now that should change.

In the foreseeable future, we will buy products that are more about taking instructions from us and understanding our environment than about showing us output.

This is what an AR device is. I will rest my case here and look forward to the love and criticism you may have to offer. Give me claps if you liked it.