For the last few weeks I’ve been working on ways to get realistic shading in an environment as large as a planet while maintaining as much detail in the near view as possible. To achieve this, I use Physically Based Rendering (PBR) for the light shading model and combine it with the values supplied by the precomputed atmosphere. Additionally, a global/local volumetric environment probe system seamlessly provides information for Image Based Lighting (IBL) in real time.

Volumetric Deferred Lights:

When using forward rendering, the shading cost is usually tied to the number of lights present in the scene. In contrast, when using deferred rendering, the shading cost shifts to the rendering resolution, since the surface data is now stored in G-buffer textures. For most operations in deferred rendering, a screen quad mesh is used to process a texture; this ensures that every pixel on the screen is processed. To reduce the shading cost, it’s possible to draw basic shapes instead of a screen quad, and use projective mapping to perform the texture lookups instead. (You can view the PBR common shader here, and the point light shader here)

This would be a regular deferred pass using a screen quad.

-- vertex shader
layout(location = 0) in vec2 vs_Position;
layout(location = 1) in vec2 vs_TexCoord;

out vec2 fs_TexCoord;

void main()
{
    gl_Position = vec4(vs_Position, 0, 1);
    fs_TexCoord = vs_TexCoord;
}

-- fragment shader
layout(location = 0) out vec4 frag;

in vec2 fs_TexCoord;

uniform sampler2D s_Tex0; // depth
uniform sampler2D s_Tex1; // albedo

void main()
{
    float depth = texture(s_Tex0, fs_TexCoord).r;
    vec4 albedo = texture(s_Tex1, fs_TexCoord);
    vec3 wpos = GetWorldPos(fs_TexCoord, depth);
    frag = vec4(albedo.rgb, 1);
}

And this would be a deferred volumetric pass using a cube mesh.

-- vertex shader
layout(location = 0) in vec3 vs_Position;

out vec3 fs_ProjCoord;

uniform mat4 u_ViewProjMatrix;
uniform mat4 u_ModelMatrix;

void main()
{
    gl_Position = u_ViewProjMatrix * u_ModelMatrix * vec4(vs_Position.xyz, 1);
    fs_ProjCoord.x = (gl_Position.x + gl_Position.w) * 0.5;
    fs_ProjCoord.y = (gl_Position.y + gl_Position.w) * 0.5;
    fs_ProjCoord.z = gl_Position.w;
}

-- fragment shader
layout(location = 0) out vec4 frag;

in vec3 fs_ProjCoord;

uniform sampler2D s_Tex0; // depth
uniform sampler2D s_Tex1; // albedo

void main()
{
    float depth = textureProj(s_Tex0, fs_ProjCoord).r;
    vec4 albedo = textureProj(s_Tex1, fs_ProjCoord);
    vec2 uv = fs_ProjCoord.xy / fs_ProjCoord.z;
    vec3 wpos = GetWorldPos(uv, depth);
    frag = vec4(albedo.rgb, 1);
}

Volumetric Environment Probes:

For this approach, the environment probes are treated as another type of light, just like a point, a spot or an area light. The system consists of two parts: a global cubemap, and a list of smaller parallax-corrected cubemaps. The global cubemap is generated first and contains the sky, sun and cloud lighting information. Next I generate the local cubemaps, but change the clear color to transparent so that they can be blended later on. At this point all the information is generated and ready to be drawn. For the actual drawing, I use a screen quad volume for the global cubemap, and a box volume for the local cubemaps. First I clear all the buffers and draw the local volumes, then I draw the global volume while using a stencil buffer to skip the pixels already shaded. This works, but a local cubemap still shades pixels outside of its range; to fix this, I discard the pixel if the reconstructed world position falls outside the volume. Finally, in the local volume passes, I blend the local cubemap with the global one using its alpha channel. (You can view the render pipeline object here, the envprobe script object here, and the envprobe shader here)

Procedural Terrain Shading:

Now that the IBL information is ready, it’s time to actually shade the terrain. First I generate a splatmap using information such as the terrain slope and altitude range. The detail color and normal textures are loaded and stored in texture arrays. To improve the quality, they are mipmapped and use anisotropic and linear filtering. Several different techniques are used to shade the terrain, such as normal mapping, height- and distance-based blending, and Parallax Occlusion Mapping (POM) for the rocks. (You can view the tile producer script object here, the splatmap shader here, the planet script object here, and the planet shader here)
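To give an idea of how a slope term can drive the splatmap, here is a minimal CPU-side sketch: it ramps from one layer (say grass) to another (say rock) as the surface normal tilts away from up. The thresholds and layer names are illustrative, not the values used in the actual splatmap shader.

```c
/* Blend weight of the "rock" layer from the terrain slope.
 * normal_y is the Y component of the (unit) surface normal:
 * 1 on flat ground, approaching 0 on a cliff face. */
float rock_weight(float normal_y, float grass_max_slope, float rock_min_slope)
{
    float slope = 1.0f - normal_y; /* 0 = flat, 1 = vertical */
    /* linear ramp between the two thresholds, clamped to [0, 1] */
    float t = (slope - grass_max_slope) / (rock_min_slope - grass_max_slope);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t;
}
```

The same shape of ramp works for the altitude-based channels; the weights are then normalized so each texel of the splatmap sums to one.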

Tessellation:

While the planet still uses a quadtree for tile generation, tessellation is now used for the actual mesh rendering. This is needed to boost the amount of polygons close to the player camera, and it fixes some collision mismatches I had when generating the tile colliders. It’s also very useful for controlling the terrain quality based on the GPU capabilities. (You can view the planet shader here)
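The distance-based part of that quality control can be sketched as a simple mapping from camera distance to a tessellation level, which a control shader would then write to gl_TessLevelOuter/Inner. The ranges and levels below are illustrative; the actual planet shader picks its own values (and can scale them by the GPU capabilities).

```c
/* Map camera distance to a tessellation level: patches near the camera get
 * the most subdivision, far ones the least, with a linear falloff between. */
float tess_level(float dist, float near_dist, float far_dist,
                 float max_level, float min_level)
{
    float t = (dist - near_dist) / (far_dist - near_dist);
    if (t < 0.0f) t = 0.0f; /* clamp: closer than near_dist -> max detail */
    if (t > 1.0f) t = 1.0f; /* farther than far_dist -> min detail */
    return max_level + t * (min_level - max_level);
}

/* usage: evaluate per patch edge so neighboring patches agree on the level
 * along their shared edge, avoiding cracks in the terrain mesh */
```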

Conclusion:

I also did a lot of work around model loading. I’m using the glTF pipeline to generate binary models, and added the ability to create colliders directly from the vertex/index buffers, meaning it’s now possible to stream large models, as they load almost instantly.
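Building a collider straight from the buffers amounts to walking the index buffer three indices at a time and handing each triangle to the physics side without any intermediate mesh processing. A sketch under that assumption (buffer layout and function names are illustrative, not the engine’s actual API):

```c
#include <stddef.h>

/* Expand an indexed triangle mesh into a flat triangle list for a collider.
 * positions holds xyz per vertex; out receives 9 floats per triangle
 * (and must be sized accordingly). Returns the number of triangles written. */
size_t collider_from_buffers(const float *positions,
                             const unsigned *indices, size_t index_count,
                             float *out)
{
    size_t tri = 0;
    for (size_t i = 0; i + 2 < index_count; i += 3, ++tri) {
        for (size_t v = 0; v < 3; ++v) { /* copy the three corners */
            const float *p = &positions[indices[i + v] * 3];
            out[tri * 9 + v * 3 + 0] = p[0];
            out[tri * 9 + v * 3 + 1] = p[1];
            out[tri * 9 + v * 3 + 2] = p[2];
        }
    }
    return tri;
}
```

Since glTF binary buffers are already in this layout, the collider can be built during streaming with no extra conversion pass.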

References:

[1] Encelo’s Blog (Volumetric Lights)

[2] Cinder-Experiments (Parallax Corrected Cubemap)

[3] Asylum_Tutorials (Physically Based Rendering)

[4] Proland (Planet Rendering)