Google has announced ARCore, which brings augmented reality (AR) functionality to Android smartphones starting today. This is great news because you can immediately get started with Google’s developer preview of ARCore, which includes Unreal Engine support!

ARCore enables AR development across the Android ecosystem, giving developers the ability to make compelling AR experiences without the need for any additional hardware.

Today, the ARCore SDK supports the Google Pixel, Pixel XL, and the Samsung Galaxy S8 running Android 7.0 Nougat and above. As the developer preview matures, Google plans to add support for more devices, targeting 100 million devices at launch.

We at Epic are working to empower developers to create amazing AR experiences using Unreal Engine, which is gaining deeper AR platform support by the day. Unreal Engine 4.18, coming in mid-October, is shaping up to be a major release for AR, with more mature ARKit support, along with Beta support for ARCore.

“Augmented reality is the next step in the evolution of the smartphone, and Unreal Engine developers are already hard at work on great AR experiences. ARCore will help further drive AR adoption by empowering developers to build and ship cross-platform AR experiences. We encourage the Unreal community to check out today’s Beta Unreal Engine 4 support for ARCore on GitHub as well as the preview coming in Unreal Engine 4.18.” - Mark Rein, Co-Founder and Vice President, Epic Games

Let’s talk a bit more about how ARCore works. There are three main components that help transform how mobile users see the world: motion tracking, environmental understanding and light estimation.

Motion Tracking

As your mobile device moves through the world, ARCore combines visual data from the device's camera and inertial measurements from the device's IMU to estimate the pose (position and orientation) of the camera relative to the world over time. This process, called visual inertial odometry (VIO), lets ARCore know where the device is relative to the world around it.
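The fusion idea behind VIO can be illustrated with a toy filter: integrate the IMU to predict motion, then pull the estimate back toward the camera-derived position. This is a conceptual sketch only; `vio_step`, its gain, and every value here are invented for illustration and bear no resemblance to ARCore's actual tracking pipeline.

```python
def vio_step(pos, vel, accel, dt, visual_pos=None, gain=0.3):
    # Toy 1-D visual-inertial fusion step (illustration only, not
    # ARCore's algorithm): predict with the IMU, correct with vision.
    vel = vel + accel * dt          # integrate acceleration -> velocity
    pos = pos + vel * dt            # integrate velocity -> position
    if visual_pos is not None:      # a camera-derived fix is available
        pos = pos + gain * (visual_pos - pos)
    return pos, vel

# One frame: the IMU reports 1 m/s^2 while the camera says we are
# near 0.02 m; the estimate lands between the two sources.
pos, vel = vio_step(0.0, 0.0, accel=1.0, dt=0.1, visual_pos=0.02)
print(pos, vel)
```

The visual correction is what keeps pure IMU integration from drifting: without it, small acceleration errors accumulate into large position errors within seconds.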

By aligning the pose of the virtual camera that renders your 3D content with the pose of the device's camera provided by ARCore, virtual content is rendered from the correct perspective. The rendered virtual imagery is then overlaid on the image obtained from the device's camera, making it appear as if the virtual content is part of the real world.
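In code terms, that alignment amounts to expressing world-space content in the device camera's coordinate frame. Below is a minimal sketch in plain Python math, not the ARCore or Unreal Engine API; the quaternion convention and camera axes are assumptions made for the example.

```python
def quat_to_matrix(q):
    # Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix.
    w, x, y, z = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def world_to_camera(point, cam_pos, cam_quat):
    # Express a world-space point in the camera's local frame:
    # p_cam = R^T (p_world - cam_pos), where R maps camera -> world.
    R = quat_to_matrix(cam_quat)
    d = [point[i] - cam_pos[i] for i in range(3)]
    # R^T d: the transpose is the inverse for a rotation matrix.
    return [sum(R[j][i] * d[j] for j in range(3)) for i in range(3)]

# Tracked device pose: 1 m along +z, no rotation (identity quaternion).
cam_pos = (0.0, 0.0, 1.0)
cam_quat = (1.0, 0.0, 0.0, 0.0)

# A virtual object anchored at the world origin ends up 1 m along -z
# in camera space, so it renders from the correct perspective.
print(world_to_camera((0.0, 0.0, 0.0), cam_pos, cam_quat))  # → [0.0, 0.0, -1.0]
```

Feeding the same pose to the virtual camera every frame is what keeps anchored content glued to the real world as the user moves.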

Environmental Understanding

ARCore is constantly improving its understanding of the real-world environment by detecting feature points and planes. Feature points are visually distinct features in the captured camera image that ARCore can recognize even when the camera's position changes slightly. ARCore estimates pose changes by triangulating these feature points across successive frames.
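Two-view triangulation is the geometric idea at the heart of this: the same feature observed from two different camera positions pins down a point in space. A simplified 2D version (a hypothetical helper for illustration, not ARCore's implementation) intersects the two viewing rays:

```python
def triangulate_2d(c1, d1, c2, d2):
    # Intersect two viewing rays c_i + t_i * d_i in the plane
    # (2D for clarity; real triangulation works in 3D).
    # Solve c1 + t1*d1 = c2 + t2*d2 for t1 via Cramer's rule.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        return None  # parallel rays: no parallax, so no depth
    rx, ry = c2[0] - c1[0], c2[1] - c1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (c1[0] + t1 * d1[0], c1[1] + t1 * d1[1])

# The same feature seen from (0, 0) and from (2, 0), with bearing
# rays toward it, triangulates back to its true location.
feature = triangulate_2d((0.0, 0.0), (1.0, 2.0), (2.0, 0.0), (-1.0, 2.0))
print(feature)  # → (1.0, 2.0)
```

The parallel-ray case is why camera motion matters: with no baseline between the two views, depth cannot be recovered.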

ARCore looks for clusters of feature points that appear to lie on common horizontal surfaces, like tables and desks, and makes these surfaces available to your app as planes. ARCore can also determine each plane's boundary and make that information available to your app. You can use this information to place virtual objects resting on flat surfaces, such as a character running around on the floor or a table.
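Plane detection can be sketched as grouping feature points of similar height and taking their extent as the plane's boundary. This is a deliberately naive toy; `detect_horizontal_plane`, its tolerance, and the rectangular boundary are illustrative assumptions, not ARCore's algorithm, which tracks true polygonal boundaries.

```python
def detect_horizontal_plane(points, tolerance=0.02):
    # Cluster of feature points (x, y, z) with y as the up axis.
    # If all heights agree within `tolerance`, treat them as one
    # horizontal plane and report its height and boundary.
    ys = [p[1] for p in points]
    height = sum(ys) / len(ys)
    if max(abs(y - height) for y in ys) > tolerance:
        return None  # points do not share a common horizontal surface
    xs = [p[0] for p in points]
    zs = [p[2] for p in points]
    return {"height": height,
            "bounds": ((min(xs), min(zs)), (max(xs), max(zs)))}

def contains(plane, x, z):
    # True if (x, z) falls inside the plane's detected boundary --
    # i.e. a safe spot to place a virtual object on the surface.
    (x0, z0), (x1, z1) = plane["bounds"]
    return x0 <= x <= x1 and z0 <= z <= z1

# Feature points on a table top roughly 0.7 m above the floor.
table = detect_horizontal_plane(
    [(0.0, 0.70, 0.0), (1.0, 0.71, 0.0), (1.0, 0.70, 0.5), (0.0, 0.69, 0.5)]
)
print(contains(table, 0.5, 0.25))  # → True
```

An app would use the boundary test exactly this way: only spawn the character where the plane actually extends, so it never appears to float off the edge of the table.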

Light Estimation

Finally, through light estimation, ARCore can detect information about the lighting of its environment and provide you with the average intensity of a given camera image. This information enables you to light your virtual objects under the same conditions as the environment around them, increasing the sense of realism.
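Applying the estimate can be as simple as scaling a virtual material's base color by the average image brightness. The sketch below is illustrative only; the function names and the [0, 1] intensity range are assumptions for the example, not the ARCore API.

```python
def average_intensity(gray_pixels):
    # ARCore-style light estimate: mean brightness of the camera
    # image, here over grayscale pixel values in [0, 1].
    return sum(gray_pixels) / len(gray_pixels)

def lit_color(albedo, intensity):
    # Scale the virtual object's base (albedo) color by the estimated
    # scene intensity so it matches the real environment's brightness.
    return tuple(round(c * intensity, 3) for c in albedo)

pixels = [0.2, 0.3, 0.25, 0.25]        # a dim indoor camera frame
intensity = average_intensity(pixels)  # 0.25
print(lit_color((1.0, 0.8, 0.6), intensity))  # → (0.25, 0.2, 0.15)
```

The effect is subtle but important: a brightly lit virtual object dropped into a dim room instantly reads as fake, while one dimmed to match the frame blends in.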

Without further ado, we encourage you to check out the ARCore developer preview and start experimenting today. Keep an eye out for more learning resources and supporting content for ARCore development as we get closer to the 4.18 release. We can’t wait to see your work, so if you decide to share it, tag us using @UnrealEngine and #UE4 on Twitter, Facebook or Instagram so that we can share it, too!