Today, Google is announcing the debut of ARCore, an initiative to bring mobile-powered AR experiences to the masses like never before. Previously Google’s Tango was the best way to see powerful AR projects in action, but that required extra cameras and sensors on high-end smartphones to work. Now, ARCore is aiming to democratize augmented reality for the Android ecosystem by offering a software-only solution.

“[Today] we’ll be announcing a preview of something we call ‘ARCore,’” said Clay Bavor, VP of Augmented and Virtual Reality at Google, during an interview with UploadVR. “It’s an SDK for Android developers to build AR experiences for Android phones — a software only solution for doing stuff. So basically we’re bringing much of the goodness of Tango onto a very broad range of AR devices.”

Starting today, Google is making the ARCore SDK available to owners of the Google Pixel (running Android Oreo) and Samsung’s Galaxy S8 (running at least Android 7.0 Nougat), with a target of running on millions of devices by “this Winter,” according to Bavor. Other Android devices from Samsung, as well as smartphones from LG, Huawei, and ASUS, are all expected to get support over time.

In an aim to make AR development as accessible as possible, ARCore will work with Java/OpenGL, Unity, and Unreal from day one. It focuses on three core capabilities, according to a prepared statement from the company:

Motion tracking: Using the phone’s camera to observe feature points in the room along with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves, so virtual objects remain accurately placed.

Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.

Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
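As a rough illustration of the light estimation idea, the sketch below averages the luminance of a camera frame and uses that single intensity value to shade a virtual object’s base color. The function names and data here are invented for illustration only; this is not the ARCore API.

```python
# Hypothetical sketch of "light estimation": derive one ambient-intensity
# value from a camera frame, then tint a virtual object's base color with
# it. All names and values are made up for illustration.

def estimate_ambient_intensity(frame_pixels):
    """Average luminance of an 8-bit grayscale frame, scaled to 0.0-1.0."""
    return sum(frame_pixels) / (len(frame_pixels) * 255.0)

def lit_color(base_rgb, intensity):
    """Scale a virtual object's base color by the estimated ambient light."""
    return tuple(round(c * intensity, 3) for c in base_rgb)

# A dim frame (mostly dark pixels with a few bright spots) yields a
# correspondingly darker render of the virtual object.
dim_frame = [40] * 90 + [200] * 10
intensity = estimate_ambient_intensity(dim_frame)
shaded = lit_color((0.8, 0.6, 0.4), intensity)
```

In a real renderer this intensity would feed the lighting model rather than directly scaling colors, but the effect is the same: virtual objects dim and brighten with the room around them.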

Hands-On Impressions

During a visit to Google’s San Francisco office last week I got the chance to see three different ARCore demos showing off what the SDK can do. Everything is built on the foundation laid by Tango and stands as proof of how well the technologies Google is creating can scale. At the meeting I spoke with Jon Wiley, Director of Immersive Design, and Nikhil Chandhok, Director of Product for Google AR, as well as the aforementioned Clay Bavor.

The first demo was a relatively standard AR proof of concept that let me place 3D models on a tabletop, move them around, and resize them. I could play around with a tree, a house, a mountain, and the little Android robot mascot. As great as that was, the most impressive thing about the whole demo was that every model was created inside Blocks, Google’s latest VR 3D modeling program. That platform, combined with Tilt Brush, is dramatically lowering the barrier to entry for aspiring designers, and platforms like ARCore only expand that access further.

One of my other favorite bits of the demo was how the little Android robots wobbled around and walked across the table. If I leaned down and put the phone close enough, they’d even look at me and wave. Everything persisted if I moved the phone away, and the camera was even able to track the location and plane of flat surfaces such as the table and floor. This meant I could move models from one surface to another and they’d retain their scale relative to the rest of the environment.
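That surface tracking can be sketched in miniature: bucket the sparse feature points (the same ones used for motion tracking) by height, and treat any dense horizontal band as a candidate plane. This is an invented simplification for intuition only, not ARCore’s actual algorithm.

```python
# Toy plane detection: group 3D feature points into height buckets and
# report any bucket dense enough to plausibly be a flat surface.
# An invented simplification, not how ARCore actually works.
from collections import defaultdict

def detect_horizontal_planes(points, bucket_size=0.05, min_points=4):
    """points: list of (x, y, z) in meters; returns candidate plane heights."""
    buckets = defaultdict(list)
    for x, y, z in points:
        buckets[round(y / bucket_size)].append(y)
    planes = []
    for key in sorted(buckets):
        ys = buckets[key]
        if len(ys) >= min_points:            # dense band -> candidate surface
            planes.append(sum(ys) / len(ys))  # average height of the plane
    return planes

# Feature points on a floor (y ~ 0.01 m) and a tabletop (y ~ 0.75 m),
# plus two stray points that form no surface.
cloud = ([(x * 0.1, 0.01, 0.2) for x in range(5)] +   # floor
         [(x * 0.1, 0.75, 0.5) for x in range(5)] +   # table
         [(0.3, 0.4, 0.1), (0.9, 1.3, 0.2)])          # strays
planes = detect_horizontal_planes(cloud)  # two candidate heights found
```

With two planes recovered at known heights, a model dragged from the table to the floor can simply be re-anchored to the new plane while keeping its world-space dimensions, which is why the scale stays consistent.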

All of that without any depth sensors or extra cameras on the phones. It was running on a Google Pixel.

The second demo Google showed me focused on large, life-sized 3D modeled characters. The characters on display (a lion, a tin man, and a scarecrow) were all themed after The Wizard of Oz, because why not? They took me to another corner of the room and placed the lion next to a chair with a light source behind him. He stood there and the light cast shadows across his torso in a surprisingly realistic manner.

Then Jon Wiley stepped into the frame and stood next to the lion as it towered over him, similar to how I’m standing in the image above. The lion recognized his presence, looked down at him, and flexed its muscles in a display of superiority. Then Elizabeth Markman, a Communications Manager at Google, turned off the lights. The lion grabbed his tail, looked up at the ceiling, and quivered in fear. It was a remarkable series of events and it all played out flawlessly right before my eyes.

The final demo I saw during my meeting was the most practical. Using a plugin on the Wayfair website, Nikhil Chandhok measured a corner of our meeting room with his finger on the phone’s screen. He dragged a cursor to mark out the length, width, and height of the type of chair he wanted, and the Wayfair website then displayed results only for chairs that would fit in that space. I can see this type of technology being used to buy furniture as shown, paint for walls, sheets and blankets for beds, pillows for couches, and much more. It’s exciting to think about.
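A minimal sketch of how such a fit filter could work, assuming a measured space and a made-up catalog of item dimensions (the real Wayfair integration is not public, so every name and number below is hypothetical):

```python
# Hypothetical fit filter: given a measured corner (width, depth, height
# in meters), keep only catalog items whose bounding boxes fit inside it.
# Catalog data and names are invented for illustration.

def fits(space, item_dims):
    """True if the item's (w, d, h) fits within the measured (w, d, h)."""
    return all(i <= s for i, s in zip(item_dims, space))

def filter_catalog(space, catalog):
    return [name for name, dims in catalog if fits(space, dims)]

measured_corner = (0.9, 0.9, 1.2)  # meters, as swept out on the phone screen
catalog = [
    ("reading chair", (0.8, 0.85, 1.0)),
    ("wingback chair", (0.95, 0.9, 1.1)),  # too wide for this corner
    ("ottoman", (0.6, 0.6, 0.45)),
]
matches = filter_catalog(measured_corner, catalog)  # items that fit
```

The interesting part of the demo is not the comparison itself but where the measurements come from: ARCore’s pose tracking turns a finger drag on the screen into real-world distances.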

Interestingly, this Wayfair example is the first I’ve seen of what Clay Bavor described as “WebAR,” wherein the user doesn’t need a special application installed on their phone at all. Instead, just by visiting a website that implements the ARCore code in a compatible browser, the phone can deliver an AR experience directly from the web.

In a world where Apple already has ARKit, it was only a matter of time before Google unveiled something similar. With support for both Pixel and Galaxy S8 devices starting today, and even more Android phones coming in the near future, the number of AR-capable smartphones in the world is starting to dramatically increase. You can read more about Google’s plans for ARCore and what it means for immersive computing right here.

What do you think of Google’s ARCore? Do you have plans to develop for it? Let us know down in the comments below! For more information about ARCore you can read the official Google blog post here and find the SDK and development details here.