The other day, my wife and I were talking about how we could leverage augmented and virtual reality for real-world cases. We naturally drifted to the idea that AR and VR solve similar problems but differ in approach.

Just look: room-scale VR headsets are so exciting because they let the user move around freely. We call that "6 degrees of freedom" (6DoF). They achieve it with a sophisticated system of sensors. ARKit and ARCore are great because they let the user move around an object. It's really the same 6 degrees of freedom, but without any hardware setup: mobile AR relies only on software and a single camera.
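Conceptually, a 6DoF pose is nothing more than three translation values plus three rotation angles. A minimal sketch of how a tracking system might represent one as a homogeneous transform (this is an illustration, not ARKit's or ARCore's actual API):

```python
import numpy as np

def pose_6dof(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 camera pose from 6 degrees of freedom:
    3 for translation (tx, ty, tz) and 3 for rotation (roll, pitch, yaw)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Elementary rotations about x, y, z, composed as R = Rz @ Ry @ Rx
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T

# Walking one meter forward along z while turning 90 degrees (yaw)
pose = pose_6dof(0.0, 0.0, 1.0, 0.0, 0.0, np.pi / 2)
```

Whether the six numbers come from a VR headset's external sensors or from a phone camera's visual-inertial tracking, the downstream math is the same.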

Once you realize this, life is never the same. We felt as though we had been ripped off. To test the idea we built an AR demo where the user doesn't move around a model but moves inside it and looks around. To make the demo even more exciting, we built it for a real live project.

My wife is an interior designer. She has used 360° renders, VR scenes, and AR for furniture objects, which lets her do a lot of work remotely. At one point she needed to show a kitchen design to a customer who was thousands of kilometers away, on the construction site. So we built an ARKit demo of his kitchen design. Everything was modeled, and we made the window glass transparent so we could be sure the projected and real worlds were perfectly aligned. And it worked!

More on innasparrow.com

Using such an application on the construction site, or anywhere else, you can naturally get close to small details and move around freely to feel the 1:1 scale of the future apartment. It's even more exciting than an ordinary room-scale VR scene.


Let's be honest: a VR demo has its advantages, since the user is fully immersed in the scene through a stereo headset. But wait: an AR-ready phone can also show a stereo picture in VR goggles. So why not use a simple cardboard viewer, keep a hole for the phone's camera, and use it for 6DoF tracking?
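The stereo picture boils down to rendering the same scene from two viewpoints, each offset by half the interpupillary distance (IPD) from the tracked head pose. A rough sketch (the ~63 mm average IPD is an assumed value; real engines apply this offset inside the projection pipeline):

```python
import numpy as np

IPD = 0.063  # average interpupillary distance in meters (assumed value)

def eye_positions(head_position, head_right_vector, ipd=IPD):
    """Offset the tracked head position by half the IPD along the
    head's local right axis to get the left and right eye viewpoints."""
    right = np.asarray(head_right_vector, dtype=float)
    right /= np.linalg.norm(right)  # make sure the axis is unit length
    half = (ipd / 2.0) * right
    head = np.asarray(head_position, dtype=float)
    return head - half, head + half

# Head 1.6 m above the floor, facing -z, so the local "right" axis is +x
left_eye, right_eye = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
```

Each frame, the scene is rendered once from `left_eye` and once from `right_eye` into the two halves of the screen, which is exactly why the phone has to do twice the rendering work.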

Really, we are not so far from true mixed reality. What if your phone had two cameras spaced the same distance apart as the average distance between human eyes? That would allow it to feed two separate video streams to the two eyes and overlay virtual content from slightly different perspectives. The stereo effect would create a realistic feeling of depth. I hope this camera baseline would also help to recognize how real objects are placed in the scene, so that the occluded parts of augmented objects could be masked.
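That occlusion hope rests on standard stereo geometry: with two rectified cameras at a known baseline, the depth of a point follows from its disparity, the horizontal shift between the two images. A simplified pinhole-camera sketch (the focal length and baseline values here are purely illustrative):

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classic stereo depth for a rectified pinhole pair: Z = f * B / d.
    Nearby objects shift more between the two images (larger disparity)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 63 mm camera baseline
near = depth_from_disparity(1000, 0.063, 90)  # 0.7 m: large shift, close object
far = depth_from_disparity(1000, 0.063, 9)    # 7.0 m: small shift, distant object
```

Once you have a depth value per pixel, masking is a comparison: if the real surface is closer than the virtual object at that pixel, draw the camera image instead of the render.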

It's so simple that it looks like magic. But in practice, phone hardware doesn't seem powerful enough to do twice the calculations. Even now the projected environment isn't stable enough and often drifts slightly. On a phone screen that's acceptable, but in a fully immersive virtual reality headset it isn't: the user will feel dizzy very fast.

I'm looking forward to the time when we finally get the next level of AR hardware.