Developers hoping to incorporate augmented reality into iOS games and apps will be able to get some assistance from Apple itself during the upcoming Game Developers Conference, as part of the 'Introduction to Apple's ARKit' session presented by the iPhone and iPad maker.

Viewable in the online session scheduler for GDC 2018, the lecture "Introduction to Apple's ARKit: Best practices and recent updates" will be presented by Michael Kuhn, the leader of Apple's ARKit engineering team.

The session will introduce attendees to the core concepts of the ARKit framework, including underlying principles and the associated API. Participants will be told how to start using the various tracking and scene understanding capabilities of ARKit, as well as how to integrate the framework into rendering and game engines.
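For developers new to the framework, the tracking capabilities mentioned above boil down to a small amount of setup code. The following is a minimal sketch of starting an ARKit world-tracking session with horizontal plane detection, assuming a view controller that already holds an `ARSCNView` outlet named `sceneView`:

```swift
import ARKit

// Configure world tracking, the core six-degrees-of-freedom
// tracking mode that most ARKit apps use.
let configuration = ARWorldTrackingConfiguration()

// Ask ARKit's scene understanding to detect flat horizontal
// surfaces (tables, floors) as ARPlaneAnchor objects.
configuration.planeDetection = .horizontal

// Begin the session; the view renders the camera feed and
// virtual content, and delegate callbacks report new anchors.
sceneView.session.run(configuration)
```

From there, implementing `ARSCNViewDelegate` methods such as `renderer(_:didAdd:for:)` lets an app respond as planes are detected and refined.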

It will also highlight the best practices developers should consider when producing AR content, including creating the experience, object placement in the real-world view, interacting with virtual items, and the applications of AR in games. Basic concepts and challenges of AR and computer vision will also be discussed, raising issues about common problems developers can run into while producing their apps.
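Object placement, one of the best practices named above, is typically handled by hit testing a screen point against the detected planes. This is a hedged sketch of a hypothetical tap handler; the function name and geometry are illustrative, but `hitTest(_:types:)` and `ARPlaneAnchor`-based placement are standard ARKit usage:

```swift
import ARKit
import SceneKit

// Place a small virtual sphere where the user's tap intersects
// a detected real-world plane (hypothetical helper for illustration).
func placeObject(at point: CGPoint, in sceneView: ARSCNView) {
    // Hit test against planes ARKit has already detected,
    // constrained to each plane's estimated extent.
    let results = sceneView.hitTest(point, types: .existingPlaneUsingExtent)
    guard let hit = results.first else { return }

    // A 5 cm sphere, positioned using the hit's world transform.
    let node = SCNNode(geometry: SCNSphere(radius: 0.05))
    node.simdTransform = hit.worldTransform
    sceneView.scene.rootNode.addChildNode(node)
}
```

Anchoring content to hit-test results against real surfaces, rather than placing it at arbitrary coordinates, is what keeps virtual items visually locked to the real-world view.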

According to the speaker's profile, Kuhn has a background in computer science and has worked on AR and its related technologies since 2004, giving him a strong understanding of both the challenges and the potential of AR content. That experience led Kuhn to join Apple in 2015, where he leads the team working on ARKit.

Since introducing ARKit at WWDC 2017 and releasing it as part of iOS 11, Apple has been keen to help developers adopt the framework in their apps, to the point that it even offers basic lessons on its usage in Swift Playgrounds. The GDC session is likely to attract more developers interested in using the technology in their games and apps, as well as those looking to incorporate the latest updates to ARKit in their releases.

Recently, Apple revealed an update bringing ARKit to version 1.5, giving developers early access ahead of its expected public release with iOS 11.3 in the spring of 2018. Changes in the framework include the expansion of horizontal plane detection to vertical surfaces, a refinement to its surface mapping to tackle irregularly-shaped surfaces, and a higher-resolution real-world view.
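Opting into the new vertical plane detection is a one-line change for existing apps. A minimal sketch, assuming an `ARSCNView` named `sceneView` and a device running the iOS 11.3 beta:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// ARKit 1.5 (iOS 11.3) extends planeDetection to accept
// .vertical alongside the original .horizontal option.
configuration.planeDetection = [.horizontal, .vertical]

sceneView.session.run(configuration)
```

Detected walls then arrive through the same `ARPlaneAnchor` delegate callbacks as floors and tables, with the anchor's alignment distinguishing the two.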