Your office is about to change forever. No longer will you be using a mobile phone, monitor, keyboard and mouse.

In 7 years you will not have a monitor anymore.

In 5 years you will not have a mobile phone anymore.

In 3 years you won’t need to touch type.

In 1 year, if you’re an early adopter, you can skip the 7-year wait above.

So why am I so sure that the way you interact with your devices will change? It’s obvious once you know the technology that is currently available from some of the world’s leading tech firms.

Monitor

Why would you use a large stationary monitor in your office, another at home, and be constrained by their physical size, location and resolution?

When instead you could use a holographic monitor that can be any size you want, adjusted simply by dragging its edges with your fingertips; can follow you wherever you are; can be moved to any location easily; can be multiplied; and can turn your Windows desktop into a 3D workspace.

Mobile phone

Why would you carry a 7-inch phone around with you when it’s too bulky to fit in your pocket and has a tiny visual display?

When instead you could view holographic text messages that can be placed anywhere in your field of view; read emails whilst performing other real-life tasks in the background; or join an interactive holographic conversation with a friend instead of a telephone call.

Keyboard and mouse

Why would you rest your laptop on your lap, work uncomfortably and have to carry a large device everywhere with you?

When instead you could talk instead of type; select keys on a holographic keyboard using eye tracking; move your (mouse) pointer around the display with your eyes; manipulate 3D holographic apps and interfaces with your hands; and turn your head to drastically increase your field of vision.

The answer to these and many other computer interface problems is AR, or Augmented Reality. AR currently comes in the form of glasses categorised as “wearables”, for obvious reasons. When you wear these glasses, AR layers holograms on top of real life, also known as “modified reality”, “mixed reality” or “cinematic reality”:

"Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data" - Wikipedia

So as well as seeing the real world, you can also see pretty much anything you or anyone else can conjure up in your minds. Now I know your mind is racing…

As it stands, there are many players in the AR market, but here are the main three:

HoloLens. By Microsoft, need I say more… $Bn’s of investment, and Microsoft’s CEO Satya Nadella said: “Once you use HoloLens, there’s no going back”. Untethered.

Magic Leap. Just last month Magic Leap raised $1.4Bn from some of these small dudes: Google, Morgan Stanley, JP Morgan, Alibaba and Warner Bros. A startup, they have already been valued at $4.5Bn. Adaptive focus.

Meta. Raised $2m in 2013 on Kickstarter! Recently raised a further $23m in funding. Tethered.

In the next 5 years we will no longer be using mobile phones as we know them. We will instead wear AR glasses, soft contact lenses or similar holographic devices, and have a stream of texts, emails (in whatever form), applications and real-time holographic interactions instead of phone calls.

AR will replace not just the mobile phone but every other interface we know. Imagine how much businesses will save on hardware and advertising space.

The next step in the future of interfacing is the brain-computer interface / neuro-technology / communication by thought, which I’ve written about here: /james-mackie-publishes-new-theory-on-h2h-consciousness-and-thought-communication/. A company has already implanted a graphene pad onto the brain to read and send signals, but this is too far in the future for the purposes of this blog.

Take my word for it, the AR technology now is awesome; I’ve tested it. The HoloLens already runs 1080p video at 60-90 frames per second, faster than the eye can see; Meta offers a 2560 x 1440 high-DPI display; and there is talk of 4K on the horizon. The holograms are so immersive that you feel someone or something is there with you. Microsoft, Magic Leap and Meta have each definitely cracked their own niches. The HoloLens is magical: when you move your head, the holograms stay in perfect position. Think of the use cases.

I don't want to spend too much time discussing the technology; this blog is really intended to discuss the benefits of AR, because eventually these differences will be inconsequential. But here's a little on the main three as of 2016:

The big difference between Meta and HoloLens is that Meta is tethered, i.e. the glasses are connected via a cable to your computer to utilise its processing power, whereas the HoloLens is a standalone device. There are pros and cons to both approaches: a tethered device can garner far superior processing power from the desktop computer, but the HoloLens is far more portable. Untethered will ultimately prevail, because you will be able to harness remote cloud processing and storage, and any local processing requirements will inevitably shrink in size. For many of my use cases, the device needs to be untethered.

I've noticed that Meta has a larger field of view than the HoloLens, so Meta wins there. Meta's hand interaction is nowhere near as good as Magic Leap's; the Magic Leap demos I've seen of manipulating 3D holograms with your hands are awesome. Obviously some holograms will be out of arm's reach, so pointing, clicking and dragging with your fingers needs to be fine-tuned. Between them all, the technology is getting there.

But the key technological benefit that Magic Leap has over the other two is adaptive focus. Both Meta and the HoloLens use fixed focus: your eye is focussing on the small screen in front of you, so the apparent distance of the hologram can be an issue. Magic Leap instead uses clever optics technology to trick your eyes into focussing further away, and I'm sure that is part of the reason the oracles of Silicon Valley invested $1.4Bn.

REMEMBER: when you look at Google Maps on your phone, the display pretty much stays in the same place. With AR, when you move your head the hologram can move with you, or it can remain locked to a precise stationary location, just like a real-life object. What’s more, the hologram can overlay your entire field of view rather than just a small part of it (currently around 90 degrees). So in the Google Maps example, roads would have new enlarged signs and information overlaid on top of the actual road, rather than you squinting at a small display.
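To make the head-locked versus world-locked distinction concrete, here is a minimal 2D sketch (my own illustration, not code from any of these SDKs): a world-locked hologram is re-expressed in the head's view frame every time the head moves, while a head-locked hologram keeps a constant view-frame offset.

```python
import math

def world_locked_view_pos(hologram_world, head_pos, head_yaw):
    """Re-express a world-locked hologram's (x, z) position in the head's
    view frame: subtract the head position, then rotate by -yaw."""
    dx = hologram_world[0] - head_pos[0]
    dz = hologram_world[1] - head_pos[1]
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (dx * c - dz * s, dx * s + dz * c)

def head_locked_view_pos(offset_in_view):
    """A head-locked hologram is pinned to the view frame, so its
    view-space position never changes, however the head moves."""
    return offset_in_view

# A hologram placed 2 metres straight ahead in the world:
print(world_locked_view_pos((0.0, 2.0), (0.0, 0.0), 0.0))  # (0.0, 2.0)

# Turn the head 90 degrees: the world-locked hologram slides to the side...
x, z = world_locked_view_pos((0.0, 2.0), (0.0, 0.0), math.pi / 2)
print(round(x, 6), round(z, 6))  # 2.0 0.0

# ...while a head-locked hologram stays put in view space.
print(head_locked_view_pos((0.0, 2.0)))  # (0.0, 2.0)
```

This is exactly why the Google Maps hologram can either follow your gaze (head-locked) or stay painted on the road as you walk past it (world-locked).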

Furthermore, Meta’s AR glasses can track your hands and fingers, so you can reach out for holograms and interact with and manipulate them in real 3D space. You can walk around holograms, because the headset knows where you are and tracks you through a number of sensors.

Here are some examples of ideas where AR would benefit businesses:

Enter a meeting room and see, adjacent to each attendee, their LinkedIn profile: automatically know everyone’s name, previous jobs, current role, connections etc. How about their public Facebook profile? How about their current tweets?

Walk past strangers in the street and automatically identify whether there is a mutual or potential business relationship, such as a LinkedIn 2nd- or 3rd-degree connection, or office-based geographic proximity matching, especially if you are both far from the office.

A lawyer in court could have reference material in their line of sight when addressing the court, or even have paralegals researching and displaying info in front of them as and when they need it. Who Wants to Be a Millionaire?

People giving presentations could have their speech displayed in their peripheral vision instead of looking down at papers or laptops.

Accountants would no longer be limited to one, two or three screens, and could sprawl out spreadsheet after spreadsheet next to one another for comparison.

Take video conferencing one step further: full virtual meeting attendance in an AR meeting room, further enhancing the experience.

The same goes for recruitment consultants interviewing candidates: the remote interview experience could be fully immersive.

Meta’s AR already allows you to sculpt art with your hands in 3D space (the first domain I ever purchased was Skulpt.com for this purpose) and then print the object directly on a 3D printer.

The purpose of this blog is not to cover consumer ideas, but I couldn’t resist pondering a few other cool uses for AR when you’re not at your desk:

Sit next to a stranger on a plane, or walk past a stranger in the street, and automatically know if there is a connection: a mutual Facebook friend, a LinkedIn degree of connection, previous tickets purchased for the same train, etc.

Walk past a stranger; maybe a potential Match.com match? See them in real life, rather than in 2D. A far better type of dating and matching service than the conventional one.

Don't have a chessboard but want to play chess?

Look at everything through a different perspective? See how the opposite sex sees the world? See the world from a physics (E=mc2) perspective? See the world as dolphins see their environment?

Look at the stars and know the history and current status of the stars, planets and other celestial objects you're looking at. Go into space and move around these objects.

Project directions onto the actual road.

Project restaurant menu prices and cinema showing times as you physically walk past. Show retail sales with a quick flash-through of all the men’s clothes in the shop within 3 seconds as you simply walk by, without entering. And more importantly, show user reviews and votes and, through intelligent learning, display what is most recommended for you.

Teach any sport, skill or profession with a far higher degree of immersion than ever before.

Learn about biology or astrophysics in a far more immersive way.

Vicarious living through other people’s lives for the day? Or the week? Or the year?

Implement, and get used to, your favourite home designs in any home, wherever you are.

Full-on Vanilla Sky interaction with gaming, home computers and software applications.

Fancy a change of scenery?

If a person permits it, view their health stats: heart rate, blood pressure.

Using a public AI API such as Microsoft Project Oxford, your AR glasses could recognise friends and read anyone’s emotions.
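As a sketch of how that could work: emotion APIs of this era returned a set of per-face confidence scores, and the headset would simply surface the strongest one. The score names and values below are hypothetical illustrations, not a verified Project Oxford schema.

```python
def dominant_emotion(scores):
    """Return the emotion with the highest confidence from a
    per-face emotion -> confidence mapping."""
    return max(scores, key=scores.get)

# Hypothetical per-face scores, as AR glasses might receive them
# from an emotion-recognition API:
face_scores = {
    "anger": 0.01,
    "happiness": 0.92,
    "neutral": 0.05,
    "sadness": 0.02,
}
print(dominant_emotion(face_scores))  # → happiness
```

The glasses would then render that label as a small hologram beside the person's face.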

From the comfort of my own home, have Louis Vuitton and Ralph Lauren as my personal shopping assistants, passing me their brands of clothing to try on; see the more detailed explanation below.

Now, going out there a little more:

Want to draw a toy gun, 3D print it, and shoot your friend with the gun you just drew? I mean, talk about creativity.

What if I had to run to the shops and didn't want to leave my children home alone? I could leave them in the real world but still be virtually parenting, or singing them to sleep. Obviously the AR headsets would have to be constructed lighter and child-friendly.

What if I developed a new best friend, and using AI my new entity understood my needs and how to make me a better person? What if he or she could look like my ultimate superhero best friend (or different, with improvements)? What if I could pre-program my superhero best friend to answer my questions in the manner of my favourite philosopher?

Make everything look beautiful? Your girlfriend? Or just change her hair colour or style, maybe?

Do you have OCD? Scrub out the mess so you don’t see it.

Cameras can now clearly read the greyscale, pigment and certain cellular structures of your skin to monitor your heart rate. What if your AR glasses could clearly feed back to you when someone is not being honest with you? What if your AR glasses could detect that someone behind you is about to attack or mug you, based on their vital readings? What if, using 3D spatial sound, you could amplify sounds around you to hear what people may not want you to hear?

The possibilities are endless… Predictions are that AR will be worth £100Bn within the next 5 years; hopefully now you can see why.

When I think more deeply about AR, I feel I could flood my brain with so much more information. Some might say: now why on earth would you want to do that? In some cases I agree; I want to be less connected to the mobile world. BUT there are times when I want more information. Take shopping as an example: I don’t want to walk around 25 shops in Westfield when I can simply walk past them and flood my brain with 10 pairs of jeans per second that are my size, within my budget and in the specific colour I’m looking for. Then I can walk into the shop.

If I don’t want to go to the shops, I can just find the shoes I like and try on the holographic version. With my hands. Taking this a step even further, what about an AI-retail-bot-recommended shopping trip, where I sit down in the comfort of my own home and the best designers in the world personally flood me with the shoes I know I want and the new suit I know I want? Whilst I’m sitting in my room, I have Louis Vuitton personally handing me new hand luggage to try, and Ralph Lauren himself carefully passing me shirts to try on: the colour, size and type I think I might need, based on my previous purchasing habits and on what other people with similar habits purchased. I think I would prefer this to a KitKat or coffee break!

Next, I’m hungry for lunch. I simply walk past the restaurants and, based on intelligent learning, my AR device not only knows what food I like, and maybe what I'm in the mood for, but also shows me reviews of the best-rated food from previous visitors.

Let’s say I’ve now had a few drinks at lunch and I want to meet people, so I publicly open my AR periphery to “meet new people in my location”. When someone walks past me, they see that I am available to meet.

And as with any mobile device, if I want to switch it off, I can.

I love what Satya Nadella, the CEO of Microsoft, said the other day: “We want to move from people needing Windows, to choosing Windows, to loving Windows”, and I think that’s what the HoloLens will do for Microsoft. Probably another reason why Microsoft has chosen to pay Satya, a long-term Microsoft employee, $84m per year.

For me, the HoloLens, Meta and Magic Leap are about three things: Learning, Experiencing and Interfacing.

I’m going to say it now, AR will be the next big technology megatrend.

Here at Nasstar we are paving the way to offer the first ever virtual desktop through the HoloLens. As an organisation we have applied for the first release of the HoloLens development kit.

In 2017 you will be able to buy the headset from us with Windows 10 or a Windows Holographic VDI running all of your favourite applications, alongside, of course, all of the other services we offer.

About the Author

James Mackie

I’ve been interested in holographic interfaces and 3D computing for at least a decade. I read about the HoloLens in 2012 in a leaked Microsoft document. I purchased the Oculus 1 VR headset in 2012 and since then have been captivated by VR and AR.

I am currently Head of Sales & Marketing for Nasstar. I became involved with Nasstar when they acquired my company VESK, the UK’s fastest-growing provider of hosted desktops.

I only work in fields I am passionate about, and Nasstar allow me to help plan their future roadmap as well as help with technological innovation and strategy. I see AR as a big part of that for VESK and Nasstar.