At WWDC 2017, Apple introduced ARKit, a framework that lets developers quickly build mixed reality applications on iOS, using the device's camera to bring augmented reality to life.

In this article, we will integrate ARKit into a video conferencing scenario, covering the implementation of two features:

Integrate ARKit with live video streaming

Render the live video stream to the AR plane using Agora’s Video SDK

We will use ARKit to detect a plane in the room, then use the Custom Video Source and Renderer feature introduced in Agora.io Video SDK v2.1.1 to render the remote live video stream onto that plane. The result gives the video call a holographic feel, just like in Star Wars! The source code for this demo is linked at the end of the article. Just add your Agora.io App ID to the ViewController.swift file and run the app on your device.
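As a starting point, the plane-detection step might look like the sketch below. It runs an ARKit world-tracking session with horizontal plane detection enabled and attaches a flat geometry to each detected plane; the demo later uses such a surface as the render target for the video stream. The class and property names here are illustrative, not necessarily those used in the demo project.

```swift
import UIKit
import ARKit

// Illustrative sketch: detect a horizontal plane and attach a flat
// SCNPlane node to it (the surface the video will later be drawn on).
class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Enable horizontal plane detection in the world-tracking session.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new plane. The anchor's extent gives
    // the plane's estimated size.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        let planeNode = SCNNode(geometry: plane)
        // SCNPlane stands vertically by default; rotate it to lie flat.
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    }
}
```

From here, the video frames received through the Agora SDK's custom renderer can be applied as the material of the plane's geometry.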