ARKit 1.5 added support for detecting images within the AR scene, which can be used to align your content to a specific play area or as the basis of a museum app. This example focuses on using the detection feature to highlight a painting in the room. If this were a museum app, you could pop up information about the artist, when the painting was created, the name of the piece, etc. This is a simple (aka lazy) version that draws a transparent plane over where the image was detected in the scene. Here’s a video of this app in action.

https://youtu.be/BLqupE_6VTs

The first step in building this kind of app is to import the image(s) you plan to search for in the scene. In UE4, this is as simple as dragging and dropping a JPEG into the Content Browser. Once the import has completed, there are some settings on the texture that need to be changed from the defaults (see below). Before the image can be supplied to ARKit, it needs to be converted from the UTexture format that UE4 uses as its representation of image data. The conversion code only handles variants of uncompressed, 8-bit RGBA data (RGBA, ARGB, BGRA). Since the image is being used for detection within the scene, there’s no reason to have mipmaps, so I disabled those. I also changed the texture group to UI, since the rules for that group generally don’t alter the image when packaging for the target device. (Apparently, I type World too much and merged Wolf and World in the texture name.)

Texture settings for the image to detect

Now that you have an image imported, the next step is to create a wrapper around that image that provides a bit more information for ARKit and gives you a way to determine which image ARKit created an anchor for. The asset type for this is called ARCandidateImage, which is an option when creating a new Data Asset. The menu path is Miscellaneous | Data Asset; then select the ARCandidateImage option. The image below is the ARCandidateImage I created for the texture above. ARKit needs to know which orientation you expect the image to be in. It also needs the width and height of the image in the real world; this painting is roughly 91 cm by 122 cm. ARKit uses these pieces of information to correctly place the anchor for the image in 3D space, so giving incorrect sizes can cause the anchor to be located too near the camera or too far away. The last piece of information is used by you when the image is recognized: ARKit passes the name of the image it detected as part of the anchor data. If you are using multiple images, this name is how you map back to the originating ARCandidateImage.

ARCandidateImage for the wolf texture
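Conceptually, a candidate image carries a name plus its measured physical size, and the name is what lets you map a detected anchor back to its candidate. A minimal sketch of that data and lookup, with stand-in names rather than the actual UE4 types:

```cpp
#include <string>
#include <vector>

// Illustrative model of the data an ARCandidateImage carries. These field
// names are stand-ins for the asset's settings, not the actual UE4 types.
struct FCandidateImageInfo
{
    std::string FriendlyName; // reported back as part of the image anchor
    float WidthCm;            // measured real-world width
    float HeightCm;           // measured real-world height
};

// Given the name ARKit reports with an image anchor, find the candidate
// image that produced it. Returns nullptr if the name is unknown.
const FCandidateImageInfo* FindCandidate(
    const std::vector<FCandidateImageInfo>& Candidates,
    const std::string& AnchorName)
{
    for (const FCandidateImageInfo& Candidate : Candidates)
    {
        if (Candidate.FriendlyName == AnchorName)
        {
            return &Candidate;
        }
    }
    return nullptr;
}
```

With a single image like the wolf painting, the lookup is trivial, but the same mapping is what disambiguates anchors when several candidate images are registered.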

The next step is to add this candidate image to your ARSessionConfiguration. Since this app is based on the Handheld AR Template we provide with UE4, you need to add it to D_ARSessionConfig. This is done by adding an entry to the CandidateImages array and pointing it at the wolf candidate image asset, as seen below.

ARSessionConfiguration to detect images in an AR scene
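Expressed as plain data, the edit to D_ARSessionConfig amounts to this: the session configuration holds a list of candidate images, and image detection is active simply because that list is non-empty. Type and field names below are illustrative stand-ins, not the actual UE4 session config API.

```cpp
#include <string>
#include <vector>

// Illustrative stand-in for the session configuration asset: it owns the
// list of candidate images the AR session should search for.
struct FSessionConfigSketch
{
    std::vector<std::string> CandidateImageNames;

    // Detection is enabled by registering at least one candidate image.
    bool WantsImageDetection() const
    {
        return !CandidateImageNames.empty();
    }
};
```

In the editor this is just an array entry on the data asset; no code is required for the real thing.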

There’s one last step that happens during the start of the AR session. The image is in a UE4 format and needs to be converted to something ARKit can work with. Like many machine-learning image algorithms, ARKit converts the data to grayscale first. Keep this in mind if you are designing a play mat to build your AR games on top of: color alone may not be enough for ARKit to distinguish the images. The image below is from the Preview feature in Xcode when looking at a variable in the watch window (my favorite feature of Xcode). Not that this is important, but the image is also resized during the conversion.

Post ARKit conversion

At this point, ARKit will start searching for the wolf painting in the scene. Once it finds the image, it creates an anchor that you can use to place content. My simple Blueprint below does exactly this: on Tick, it checks each frame whether any image anchors have been detected. If an image anchor is found, it spawns an 80% translucent plane in 3D space, 10 cm closer to the camera than the image anchor.

Blueprint to spawn the plane when the image is detected
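The placement step in that Blueprint boils down to simple vector math: move the spawn location 10 cm (10 UE4 units, since a unit is 1 cm by default) from the anchor toward the camera, so the translucent quad renders on top of the painting instead of z-fighting with it. A minimal sketch of that math, not the actual Blueprint nodes:

```cpp
#include <cmath>

// Minimal stand-in for a 3D vector; in UE4 this would be FVector.
struct FVec3
{
    float X, Y, Z;
};

// Offset the detected anchor's location toward the camera by OffsetCm so the
// spawned plane sits just in front of the physical painting.
FVec3 ComputePlaneLocation(const FVec3& Anchor, const FVec3& Camera,
                           float OffsetCm = 10.0f)
{
    const FVec3 ToCamera{Camera.X - Anchor.X, Camera.Y - Anchor.Y,
                         Camera.Z - Anchor.Z};
    const float Len = std::sqrt(ToCamera.X * ToCamera.X +
                                ToCamera.Y * ToCamera.Y +
                                ToCamera.Z * ToCamera.Z);
    // Degenerate case: camera exactly on the anchor; leave the plane there.
    if (Len <= 0.0f)
    {
        return Anchor;
    }
    const float Scale = OffsetCm / Len;
    return FVec3{Anchor.X + ToCamera.X * Scale,
                 Anchor.Y + ToCamera.Y * Scale,
                 Anchor.Z + ToCamera.Z * Scale};
}
```

The plane also inherits the anchor's rotation so it lies flat against the wall; only the translation is shown here.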

The 4.20 build has many more AR features coming your way. In addition to image detection, ARKit 1.5 added vertical plane support, which we now support as well. Let me know if you find some cool game ideas for using image detection.