Scene Composition

Implementation

Clear the FBO with glClearColor set to the desired background color. Draw the mask over the whole framebuffer; now the background FBO contains decorative blurred "stains" instead of a flat color. After this, draw the background shards. Note that the FBO has a resolution of only 256x256 pixels, so the image is quite pixelated at this stage. Finally, blur the whole background. After blurring, the low resolution of the FBO is barely noticeable.
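The final blur step can be modeled on the CPU like this. This is a minimal Python sketch, not the wallpaper's actual code: it ping-pongs a small grayscale "image" between two buffers, applying a 3-tap box blur (the real shader's kernel may differ) first horizontally, then vertically.

```python
# CPU model of the separable "ping-pong" blur applied to the background
# FBO. The 3-tap box kernel and function names are illustrative; the
# wallpaper performs this on the GPU with a shader.

def blur_1d(img, horizontal):
    """One pass of a 3-tap box blur along one axis, edges clamped."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for d in (-1, 0, 1):
                yy = min(max(y + (0 if horizontal else d), 0), h - 1)
                xx = min(max(x + (d if horizontal else 0), 0), w - 1)
                acc += img[yy][xx]
            out[y][x] = acc / 3.0
    return out

def ping_pong_blur(img, iterations):
    """Each iteration = horizontal pass (A -> B) + vertical pass (B -> A)."""
    for _ in range(iterations):
        img = blur_1d(img, horizontal=True)   # framebuffer A -> B
        img = blur_1d(img, horizontal=False)  # framebuffer B -> A
    return img
```

Note that each iteration only reads the result of the previous one, which is exactly why two framebuffers are enough regardless of the number of passes.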

Clear the screen. Draw the background as a plane textured from the FBO. Then draw the glass shards of the foreground. These draw calls are performed without writing to the depth buffer because all objects are transparent.
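The composition step above amounts to standard back-to-front "over" blending (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). A tiny Python model of one color channel, with names of my own choosing, shows why depth writes can stay off: each transparent shard simply blends over whatever is already there.

```python
# Model of the final composition: the blurred background plane is drawn
# first, then each transparent foreground shard is blended over it
# back-to-front. One color channel only; names are illustrative.

def blend_over(dst, src, alpha):
    """GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA for a single channel."""
    return src * alpha + dst * (1.0 - alpha)

def compose(background, shards):
    """background: channel value; shards: list of (color, alpha), back-to-front."""
    color = background
    for src, alpha in shards:
        color = blend_over(color, src, alpha)
    return color
```

For example, a half-transparent white shard over a black background yields 0.5, and a second half-transparent black shard over that yields 0.25.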

Blurring Background Layer

Mali

Result

The idea came from a video inspired by the design of Deus Ex: Human Revolution, namely its design elements with broken glass shards. Within a few days we finalized the vision of the scene, and we implemented the first version of the live wallpaper in one weekend. As a result of this work we proudly present Shattered Glass 3D Live Wallpaper. In this article we will explain the rendering techniques used in the creation of this live wallpaper.

The scene is really simple. It contains a bunch of glass shards suspended mid-air. The shards are separated into two groups: objects closer to the viewer in the foreground, and farther ones in the background. To add more depth to the scene, background objects are blurred. The color of the background can be changed. The camera moves according to left-right swipes on the screen, adding movement to the scene.

Technically this is implemented in the following way. Glass shards are placed around the camera, filling the shape of a cylinder. Shards placed farther than a certain distance are rendered into a separate texture and blurred. The colored backdrop is rendered together with the background shards. To combine the background and the foreground, first a plane with the (already blurred) background objects is drawn, and then the foreground shards are drawn over it.

In this way we've implemented a simplified depth-of-field (DOF) effect, which is enough for the described scene. A full implementation of real DOF is a bit more complex and computation-heavy. It consists of rendering the full scene into two separate textures (one full-sized, and one of smaller resolution and blurred) plus a depth map of the scene. After that, both the sharp and the blurred versions are drawn to the screen, blended according to the depth map and the camera's focal parameters.

Because all objects in the scene are transparent, all rendering is performed with various blending modes. Writing to the depth buffer is disabled to prevent objects from culling each other. There are not that many objects in the scene, so the amount of overdraw is not significant.
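The scene setup described above can be sketched as follows. This is a hedged illustration, assuming a camera at the origin; the radii, the distance threshold, and all names are made-up values for the sketch, not taken from the wallpaper's code.

```python
import math
import random

# Illustrative sketch: shards are placed on a cylinder around the camera
# (at the origin) and split by horizontal distance into a foreground
# group and a background group (the latter is rendered blurred).
# All numeric values here are assumptions for the example.

def place_shards(count, radius_near, radius_far, seed=42):
    """Scatter `count` shards around a cylinder of varying radius."""
    rng = random.Random(seed)
    shards = []
    for i in range(count):
        angle = 2.0 * math.pi * i / count            # position around the cylinder
        radius = rng.uniform(radius_near, radius_far)
        height = rng.uniform(-1.0, 1.0)
        shards.append((radius * math.cos(angle), height, radius * math.sin(angle)))
    return shards

def split_by_distance(shards, threshold):
    """Shards beyond `threshold` go to the blurred background layer."""
    foreground = [s for s in shards if math.hypot(s[0], s[2]) <= threshold]
    background = [s for s in shards if math.hypot(s[0], s[2]) > threshold]
    return foreground, background
```

The split is static here for simplicity; since the camera only rotates in place, shard distances do not change and the grouping can indeed be computed once.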
To create reflections on the glass shards we use a small (128x128 pixels) cubemap texture. The order of drawing the background and foreground objects is described in the Implementation section above.

Blur is implemented by drawing the source image between two framebuffers a few times with a special shader. To keep the shader simple, it can apply only either a vertical or a horizontal blur at a time; this behavior depends on the values of uniforms. Such a technique is called ping-pong blurring. It works this way: first, the texture from framebuffer A is rendered into framebuffer B with a horizontal blur, and then back from B to A with a vertical blur. This can be repeated for a few iterations to achieve the necessary quality of the blurred image.

Notably, modern phones and tablets (and, surprisingly, quite old hardware too) can perform not just a single blur pass but multiple passes fast enough to keep the frame rate high. In practice, the Mali T604 of the Nexus 10 can achieve a steady 40-50 fps even with 6-8 passes of blurring a 256x256 texture, where by "pass" I mean a full horizontal + vertical blur. To achieve a balance between performance on low-end devices and image quality, we decided to use 3 blur iterations with a 256x256 pixels render target.

In our previous post we made some not very flattering notes regarding nVidia's Tegra 3 GPUs. This is not because we dislike nVidia for no reason; in the same way, we don't like any other GPU manufacturer providing buggy OpenGL drivers. For example, we've encountered a strange misbehavior of the Nexus 10 OpenGL drivers while developing this live wallpaper. The problem is that rendering to texture is incorrect, but only if the orientation of the device is different from the default landscape. It is more than weird to see that the orientation of the device can affect rendering into an external render target, but this is the case in the current OpenGL drivers of the Nexus 10. Strange but true. My first thought was that I had missed some small part of GL context initialization.
So I decided to ask about this on Stack Overflow: http://stackoverflow.com/questions/17403197/nexus-10-render-to-external-rendertarget-works-only-in-landscape . And that's the part where I should praise the work of the ARM support team. In a couple of days I received an email from an ARM engineer with a proposition to file this issue to the ARM forums. I prepared a simple demo app to reproduce the issue and filed it here: http://forums.arm.com/index.php?/topic/16894-nexus-10-render-to-external-rendertarget-works-only-in-landscape/page__gopid__41612 . And in a mere 4 days I received a response informing me about a bug in the current version of the Nexus 10 drivers. Even more than that, ARM proposed a workaround which magically resolved my issue: calling one particular GL function right after another (the exact call sequence is given in the forum thread).

There is an issue in the Google bug tracker too: http://code.google.com/p/android/issues/detail?id=57391 . Unfortunately, it doesn't have any definite official answer from Google yet. If you wish to improve the quality of Nexus 10 software, please star this issue to draw Google's attention to it.