Learn how to make augmented reality apps with Android using ARCore!

Update note: Zahidur Rahman Faisal updated this tutorial for Android 7.0, Android Studio 3.5 and Kotlin 1.3. Joe Howard wrote the original.

These days, most people who own a smartphone have played augmented reality-based games like Pokémon GO or Harry Potter: Wizards Unite. Did you ever wonder how those apps augment a wonderful world around you and keep you engaged? Well, find out by making your own augmented reality (AR) app.

You can build augmented reality apps using OpenGL, Unity and Unreal. In this tutorial, you’ll get started with Android ARCore by building on top of a modified version of the OpenGL sample app that Google provides.

Note: This tutorial assumes that you’re familiar with Kotlin. If you’re just getting started with Kotlin development, please check out our tutorial, Kotlin For Android: An Introduction, first.

Introduction to ARCore

At WWDC in June 2017, Apple announced ARKit, its foray into the world of AR development. Two months later, Google announced ARCore, which it extracted from the Tango indoor mapping project.

Tango only works on particular devices that have a depth sensor, while ARCore is available on most modern Android devices.

ARCore relies on three key mechanics to augment the real world around you:

Motion tracking: ARCore determines both the position and the orientation of a virtual — simulated — object in the real world using the phone’s camera and sensor data. This is called the object’s pose. ARCore tracks the pose of the virtual objects in a scene while your phone moves. This lets it render those objects from the correct perspective according to your position.

Environmental understanding: ARCore can detect horizontal or vertical surfaces like tables, floors or walls while processing input from your device’s camera. Those detected surfaces are called planes. ARCore attaches virtual objects to a plane at a fixed location and orientation. Those fixed points are called anchors.

Light estimation: ARCore understands the lighting of the environment. It can then adjust the average intensity and color of virtual objects to put them under similar conditions as the environment that surrounds them. This makes the virtual objects seem more realistic.

The race to conquer this emerging technology is on, now more than ever. Get ready to explore the brave new — augmented — world like a Viking!

Getting Started

Did Vikings have cannons in reality? Whether they did or not, there’s no reason Vikings can’t have cannons in augmented reality! :]

Your goal for this tutorial is to augment a scene — a Viking pointing a cannon at a target — and project the scene around you using your Android device.

Start by downloading the project materials by clicking the Download Materials button at the top or bottom of this tutorial. Open the begin project in Android Studio 3.5 or later.

The project won’t compile right now. Don’t worry, it will in a while.

Setting up the Environment

Before you start building your scene, you need a device that’s capable of running ARCore. Here are two options; you can pick the one that’s right for you.

Using a Real Device

To run ARCore apps on a real device, you need to check if your device has ARCore support.

Google offers a full list of ARCore-supported devices. If you’re lucky and find your device on this list, you’re good to go!

Open Google Play Store from your device and install Google Play Services for AR. This service contains support libraries you need to run ARCore-based apps.

If your current device doesn’t support ARCore, don’t worry. You can still run it on the Android Emulator!

Using an Emulator

To use ARCore on an emulator, you need to create an Android Virtual Device (AVD) that can run ARCore apps. Give it the following configuration:

When creating your AVD, you must choose a Pixel, Pixel 2 or Pixel 3 hardware profile.

Set the system image to Oreo: API Level 27: x86: Android 8.1 (Google APIs) or a higher API Level.

Now, set the back camera to VirtualScene by navigating to Verify Configuration ▸ Show Advanced Settings ▸ Camera Back and picking VirtualScene from the drop-down.

Next, download Google Play Services for AR for your AVD from GitHub.

Boot up your AVD and drag the downloaded APK onto the running emulator.

You can also install the APK using the following command while the Virtual Device is running:

$ adb install -r Google_Play_Services_for_AR_1.14_x86_for_emulator.apk

Adding ARCore Dependencies and Permissions

Before you build and run the app, you need to add the dependencies and permissions ARCore needs. Open the app-level build.gradle file and add the following line to dependencies {...}:

implementation 'com.google.ar:core:1.14.0'

Click Sync Now in the top-right corner to sync your project and your dependencies will be in place.

Next, open AndroidManifest.xml and add the following after the manifest tag:

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />
<uses-feature android:glEsVersion="0x00020000" android:required="true" />

This code adds the necessary permission and feature requests for ARCore.

Also, add the following just before the closing application tag:

<meta-data android:name="com.google.ar.core" android:value="required" />

This adds the ARCore metadata.

Now, you’re all set to launch your first ARCore app! Build and run the project.

You’ll see a prompt to provide camera permissions. After you give permission, you’ll see a radio group at the top. You’ll use that later to select the type of object to insert into the scene.

You’ll see a snackbar at the bottom indicating Searching for surfaces…. You may also see a few points highlighted, which means that the app is tracking them.

Aim the device at a flat surface and the app starts detecting planes:

Once the app detects the first plane, the snackbar disappears and you’ll see the plane highlighted on the screen.

Note: ARCore uses clusters of feature points to detect the surface’s angle. Thus, you might have trouble detecting flat surfaces that lack texture, or light-colored surfaces like a white wall.

At this point, the app doesn’t do much, but take some time to check out its code to get your bearings… especially before you set a Viking up with a cannon!

Behind the ARCore Scene

The 3D models you’ll use are in main/assets/models. Here, you can find models for a Viking, a cannon and a target.

The OpenGL shaders are in main/assets/shaders. The shaders are from the Google ARCore sample app.

You’ll see a package named common. Inside, there are two folders:

rendering: This folder contains all the classes related to OpenGL rendering.

helper: Here, you’ll find utility classes like CameraPermissionHelper and SnackbarHelper, so you don’t have to write boilerplate code.

Planes, Anchors and Poses

To make your job easier, there’s a PlaneAttachment class inside rendering. PlaneAttachment takes a plane and an anchor as inputs and constructs a pose from that information.

For a quick recap:

A plane is a real-world planar surface. It consists of a cluster of feature points that appears to lie on a horizontal or vertical surface, such as a floor or walls.

An anchor points to a fixed location and orientation in physical space to describe the exact position of a virtual object in the real world.

A pose describes a coordinate transformation from a virtual object’s local frame to the real world coordinate frame.

Note: All three classes are part of the ARCore SDK. You can read more about each of them in the official documentation.

Imagine the real world around you is an ocean and the planes are ports. A port can anchor many ships, or virtual objects, each with their specific pose.

So, PlaneAttachment lets you attach an anchor to a plane and retrieve the corresponding pose. ARCore uses the pose as you move around the anchor point.
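To build some intuition for what a pose does, here’s a plain-Kotlin sketch, deliberately simplified to 2D with a single rotation angle. ARCore’s Pose does the same job in 3D using a quaternion and a 3D translation; none of the names below come from the ARCore API.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Simplified 2D stand-in for a pose: a rotation (in radians) plus a translation.
data class Pose2D(val angle: Double, val tx: Double, val ty: Double) {

  // Transforms a point from the object's local frame to the world frame:
  // rotate first, then translate.
  fun transformPoint(x: Double, y: Double): Pair<Double, Double> {
    val worldX = x * cos(angle) - y * sin(angle) + tx
    val worldY = x * sin(angle) + y * cos(angle) + ty
    return worldX to worldY
  }
}
```

For example, an object anchored at (2, 3) with no rotation maps its local point (1, 0) to the world point (3, 3).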

ARCore Session

The begin project includes an ARCore session object in MainActivity.kt. The session describes the entire AR state. You’ll use it to attach anchors to planes when the user taps the screen.

In setupSession(), which you call from onResume(...), the app checks if the device supports ARCore. If not, it displays a toast and the activity finishes.

Augmenting Your First Scene

Now that your app is running on a supported device or emulator, it’s time to set up some objects to render in the scene!

Adding Objects

Open MainActivity.kt and add the following properties:

private val vikingObject = ObjectRenderer()
private val cannonObject = ObjectRenderer()
private val targetObject = ObjectRenderer()

Here, you define each property as an ObjectRenderer from the ARCore sample app.

Also, add three PlaneAttachment properties just below those objects:

private var vikingAttachment: PlaneAttachment? = null
private var cannonAttachment: PlaneAttachment? = null
private var targetAttachment: PlaneAttachment? = null

These are Kotlin nullables initialized as null. You’ll create them later, when the user taps the screen.

Now, you need to set up the objects, which you’ll do in onSurfaceCreated(...). Add the following inside the try-catch block, just below the // TODO comment:

// 1
vikingObject.createOnGlThread(this@MainActivity,
    getString(R.string.model_viking_obj), getString(R.string.model_viking_png))
cannonObject.createOnGlThread(this@MainActivity,
    getString(R.string.model_cannon_obj), getString(R.string.model_cannon_png))
targetObject.createOnGlThread(this@MainActivity,
    getString(R.string.model_target_obj), getString(R.string.model_target_png))

// 2
targetObject.setMaterialProperties(0.0f, 3.5f, 1.0f, 6.0f)
vikingObject.setMaterialProperties(0.0f, 3.5f, 1.0f, 6.0f)
cannonObject.setMaterialProperties(0.0f, 3.5f, 1.0f, 6.0f)

What you’re doing here is:

1. You use the 3D model files from the begin project to set up each of the three objects.
2. You set values for ambient, diffuse, specular and specular power on each object. These material properties are the surface characteristics of the rendered model. Changing these values changes the way you see the surface of the object.

Here’s a closer look at what each of the light values does:

Ambient: The intensity of non-directional surface illumination.

Diffuse: The reflectivity of the diffuse, or matte, surface.

Specular: How reflective the specular, or shiny, surface is.

Specular Power: The surface shininess. Larger values result in a smaller, sharper specular highlight.

Play around with these values to see how your object changes.
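To see the specular power effect in isolation, here’s a tiny plain-Kotlin sketch of a Phong-style specular term. This is an assumed shading model for illustration, not the sample shader’s exact code:

```kotlin
import kotlin.math.max
import kotlin.math.pow

// Phong-style specular term: cosAngle is the cosine of the angle between
// the reflection direction and the view direction (1.0 = looking straight
// into the highlight). Raising it to specularPower narrows the highlight.
fun specularTerm(cosAngle: Double, specularPower: Double): Double =
    max(cosAngle, 0.0).pow(specularPower)
```

Slightly off-center of the highlight, say at cosAngle 0.9, a power of 60 yields a far dimmer result than a power of 6, which is why larger powers produce smaller, sharper highlights.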

Note: If you want to learn more about light cues and concepts, check out Google’s tutorial on using ARCore to light models in a scene

Attaching Anchors to the Session

Your next step is to give the user the ability to attach an anchor to the session when they tap on the screen.

To get started, find handleTap(...) in MainActivity.kt. Add the following inside the innermost if statement, just above the // TODO comment before the break statement:

when (mode) {
  Mode.VIKING -> vikingAttachment =
      addSessionAnchorFromAttachment(vikingAttachment, hit)
  Mode.CANNON -> cannonAttachment =
      addSessionAnchorFromAttachment(cannonAttachment, hit)
  Mode.TARGET -> targetAttachment =
      addSessionAnchorFromAttachment(targetAttachment, hit)
}

You’ll see an error because addSessionAnchorFromAttachment(...) doesn’t exist yet. You’ll address that in a moment.

The radio buttons at the top of the screen control the value of mode. Mode is a Kotlin enum class that includes a scale factor float value for each mode. The scale factor tunes the size of the corresponding 3D model in the scene.
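Such an enum might look something like this. This is a hypothetical sketch; the scale factor values here are made up for illustration and aren’t the begin project’s actual numbers:

```kotlin
// Hypothetical sketch of a Mode enum like the one described above.
// The scaleFactor values are illustrative, not the project's real values.
enum class Mode(val scaleFactor: Float) {
  VIKING(1.0f),
  CANNON(0.5f),
  TARGET(0.5f)
}
```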

For each mode, you set a new value for the corresponding PlaneAttachment in the when statement.

You use the old attachment and the hit value for the tap, which is an ARCore HitResult defining the intersection of the 3D ray for the tap and a plane.
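Under the hood, a hit test boils down to intersecting a ray with a plane. Here’s a self-contained plain-Kotlin sketch of that geometry; it uses none of the ARCore types, and all names are illustrative:

```kotlin
// A 3D vector as an (x, y, z) triple, plus the minimal vector math needed.
typealias Vec3 = Triple<Double, Double, Double>

fun dot(a: Vec3, b: Vec3) = a.first * b.first + a.second * b.second + a.third * b.third
fun sub(a: Vec3, b: Vec3): Vec3 = Triple(a.first - b.first, a.second - b.second, a.third - b.third)
fun add(a: Vec3, b: Vec3): Vec3 = Triple(a.first + b.first, a.second + b.second, a.third + b.third)
fun scale(a: Vec3, s: Double): Vec3 = Triple(a.first * s, a.second * s, a.third * s)

// Returns the point where the ray (origin + t * direction) meets the plane
// defined by a point and a normal, or null if the ray is parallel to it.
fun intersect(origin: Vec3, direction: Vec3, planePoint: Vec3, planeNormal: Vec3): Vec3? {
  val denom = dot(direction, planeNormal)
  if (denom == 0.0) return null
  val t = dot(sub(planePoint, origin), planeNormal) / denom
  return add(origin, scale(direction, t))
}
```

For example, a ray fired straight down from (0, 0, 2) hits the plane z = 0 at the origin.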

Now, add addSessionAnchorFromAttachment(...) at the bottom of MainActivity.kt:

private fun addSessionAnchorFromAttachment(
  previousAttachment: PlaneAttachment?,
  hit: HitResult
): PlaneAttachment? {
  // 1
  previousAttachment?.anchor?.detach()
  // 2
  val plane = hit.trackable as Plane
  val anchor = session!!.createAnchor(hit.hitPose)
  // 3
  return PlaneAttachment(plane, anchor)
}

What you’re doing here is:

1. If the previousAttachment isn’t null, you remove its anchor from the session.
2. You take the HitResult plane and create the anchor from the HitResult pose. You then add the anchor to the session.
3. Finally, with the above information about the plane and the anchor, you return the PlaneAttachment.

You’re almost ready to see your Viking do some target practice! :]

Drawing the Objects

Your last step is to draw the objects on the screen. You create plane attachments when the user taps, but you also need to draw the objects as part of the screen rendering.

To do that, go to onDrawFrame(...) and add the following calls to the bottom of the try block:

drawObject(
    vikingObject,
    vikingAttachment,
    Mode.VIKING.scaleFactor,
    projectionMatrix,
    viewMatrix,
    lightIntensity
)
drawObject(
    cannonObject,
    cannonAttachment,
    Mode.CANNON.scaleFactor,
    projectionMatrix,
    viewMatrix,
    lightIntensity
)
drawObject(
    targetObject,
    targetAttachment,
    Mode.TARGET.scaleFactor,
    projectionMatrix,
    viewMatrix,
    lightIntensity
)

Here, you call the pre-existing drawObject(...) helper function. It takes the object, its corresponding attachment, the scale factor and the matrices and values needed for OpenGL to draw the object.

The app computes the matrices using these already-present helper functions:

// 1
private fun computeProjectionMatrix(camera: Camera): FloatArray {
  val projectionMatrix = FloatArray(maxAllocationSize)
  camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f)
  return projectionMatrix
}

// 2
private fun computeViewMatrix(camera: Camera): FloatArray {
  val viewMatrix = FloatArray(maxAllocationSize)
  camera.getViewMatrix(viewMatrix, 0)
  return viewMatrix
}

// 3
private fun computeLightIntensity(frame: Frame): FloatArray {
  val lightIntensity = FloatArray(4)
  frame.lightEstimate.getColorCorrection(lightIntensity, 0)
  return lightIntensity
}

Here’s what’s going on in the code above:

1. ARCore uses the current session’s camera input to calculate projectionMatrix.
2. It also uses that input to calculate viewMatrix.
3. Finally, it uses the frame, which describes the AR state at a particular point in time, to calculate the lightIntensity.
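To see what a renderer can do with that four-element light estimate (an RGB correction plus an average pixel intensity), here’s a plain-Kotlin sketch of applying it to a material color. The exact math in the sample’s fragment shader may differ; this just shows the idea:

```kotlin
// Applies a 4-element color correction (r, g, b scale plus average pixel
// intensity, as returned by LightEstimate.getColorCorrection) to an RGB color.
// A darker environment (lower intensity) dims the virtual object's color.
fun applyColorCorrection(color: FloatArray, correction: FloatArray): FloatArray {
  val intensity = correction[3]
  return floatArrayOf(
      color[0] * correction[0] * intensity,
      color[1] * correction[1] * intensity,
      color[2] * correction[2] * intensity
  )
}
```

With a neutral correction and an intensity of 0.5, a white material comes out mid-gray, which is what makes the object look like it belongs in a dim room.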

Build and run, then select a radio button at the top to pick an object mode. Find a plane with your camera and tap to place an object.

The angle an object has when you place it depends on your device’s orientation and inclination. Move your device around and place your object with the angle you prefer.

Once you’ve placed all the objects, if you rotate your phone, you’ll see a scene like this:

Move around the scene and watch as your Viking prepares to fire. There’s no stopping your Viking now!

Where to Go From Here?

Check out the end project by clicking the Download Materials button at the top or bottom of this tutorial.

You’ve reached the shores of ARCore’s world with OpenGL and Android Studio like a Viking! To discover more, check out the official ARCore Overview.

Loot these resources to enrich your ARCore skill inventory:

You can also use ARCore with Unity, Unreal and Web. Since a good part of your development with ARCore will likely rely on Unity, take a look at our Unity content.

Finally, you can find some cool demos made with ARCore on the Google Experiments site.

I hope you enjoyed this brief intro to ARCore with Kotlin. If you have any questions or comments, join the forum discussion below.