This post presents a livecoding screencast of Augmented Reality 3D Pong. It is an experiment in using augmented gestures to interact with a game, so I picked the classic: pong. We will learn how to code pong in augmented reality with WebGL. The resulting code is only 100 lines!! Not bad for augmented reality + WebGL + a game :)
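The heart of any pong is the ball update: move the ball each frame, bounce it off the walls, and detect when it gets past a paddle. Here is a minimal sketch of that idea in plain JavaScript; the function and field names are illustrative, not taken from the actual screencast code, and it assumes a simple 2D play field:

```javascript
// Minimal pong ball physics: move the ball one step, bounce it off the
// top/bottom walls, and report when it leaves the field on a paddle side.
// Names here are illustrative, not from the real repo.
function stepBall(ball, field) {
  ball.x += ball.vx;
  ball.y += ball.vy;
  // bounce off the top and bottom walls
  if (ball.y < 0 || ball.y > field.height) ball.vy = -ball.vy;
  // the left/right edges are where the paddles live
  if (ball.x < 0) return 'missedLeft';
  if (ball.x > field.width) return 'missedRight';
  return null;
}
```

In the real game this runs once per render loop, with the paddle positions driven by the augmented gestures instead of the keyboard.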

But first… what are augmented gestures? I made augmentedgesture.js, a library which uses getUserMedia and WebRTC to grab the webcam. It analyzes the image with imageprocessing.js and extracts the location of flashy-colored balls. I first presented it at the Web-5 conference, with me punching Doom characters in augmented reality :) ‘Doom: a new workout for geek?’ on YouTube is a preview of it. For the WebGL part, we will obviously use three.js and tQuery.
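To give an idea of the principle, here is a minimal sketch of "flashy ball" detection: scan an RGBA pixel buffer (like the one canvas `getImageData` returns for a webcam frame) and compute the centroid of the pixels close to a target color. The real augmentedgesture.js does more robust image processing via imageprocessing.js; this function and its names are illustrative only:

```javascript
// Sketch of color-blob detection: find the centroid of all pixels
// within `tolerance` (Euclidean RGB distance) of a flashy target color.
// `pixels` is an RGBA buffer, e.g. ctx.getImageData(...).data.
// Illustrative only; not the actual augmentedgesture.js internals.
function findBlob(pixels, width, height, target, tolerance) {
  var sumX = 0, sumY = 0, count = 0;
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      var i = (y * width + x) * 4;
      var dr = pixels[i]     - target.r;
      var dg = pixels[i + 1] - target.g;
      var db = pixels[i + 2] - target.b;
      if (dr * dr + dg * dg + db * db <= tolerance * tolerance) {
        sumX += x; sumY += y; count++;
      }
    }
  }
  if (count === 0) return null; // no ball in view
  return { x: sumX / count, y: sumY / count, size: count };
}
```

In the browser you would draw the `<video>` element into a canvas each frame and feed the resulting pixel data to a function like this, then map the blob position to the paddle.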

Controllers for the Wii or PS3 work well as game controllers, and the Kinect is obviously super cool. But they all require you to buy specific hardware, so money is a barrier. Some even require a specific installation on your computer, with code to compile: another barrier. With augmented gestures, you don't need specific devices. I like to use objects which are cheap and readily available in our everyday life, so people get easy access to the content, in a pure web spirit. Here I use children's toys which cost 3 euros per ball. Another possibility is post-its: they work well thanks to their flashy colors, as you can see in this video, and they are available in most offices. Yet another is dish gloves, which are readily available and cheap too.

Try it! This screencast is a presentation on how to code augmented reality 3D pong. The code is on GitHub under the MIT license, and the slides of the presentation are here. I'm not sure about the format of this video… the mix of live coding + slides + screencast is unusual. Anyway, I am publishing it in a "publish early, publish often" mood :)

Enjoy!