Soft Particles

So what are soft particles? You may remember that in most older games (of the Quake 3 and CS 1.6 era), smoke and explosion effects had clearly visible hard edges where particles intersected other geometry. Modern games have gotten rid of this by using particles with soft edges around adjacent geometry.

Rendering

What is needed to make particle edges soft? First, the particle shader needs scene depth information to detect intersections and soften them. By comparing the scene depth and the particle depth in the fragment shader, we can then find the exact places where particles intersect geometry: the intersection is where the two depth values are equal. Let's go through the rendering pipeline step by step. The Android OpenGL ES and WebGL implementations of the rendering are the same; the main difference is in loading resources. The WebGL implementation is open-sourced and you can get it here: https://github.com/keaukraine/webgl-buddha.

Rendering to depth texture

To render the scene depth, we first need to create off-screen depth and color textures and attach them to a corresponding FBO. This is done in the initOffscreen() method of BuddhaRenderer.js.
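As a rough sketch of what such an FBO setup involves (the function and variable names below are illustrative, not the actual code of initOffscreen()), a WebGL2-style version could look like this:

```javascript
// Sketch: create an off-screen FBO with a color texture and a depth texture
// (WebGL2 API). initDepthFbo is a hypothetical name for illustration only.
function initDepthFbo(gl, width, height) {
  // Color attachment. We will not read meaningful colors from the depth pass,
  // but a color attachment keeps the FBO complete on all implementations.
  const colorTexture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, colorTexture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  // Depth attachment: a single-channel depth texture the particle shader
  // will later sample. DEPTH_COMPONENT16 is broadly supported in WebGL2.
  const depthTexture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, depthTexture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT16, width, height, 0,
                gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  // Attach both textures to a framebuffer object.
  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, colorTexture, 0);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                          gl.TEXTURE_2D, depthTexture, 0);
  return { fbo, colorTexture, depthTexture };
}
```

Note that WebGL 1 would additionally require the WEBGL_depth_texture extension for a texture with a DEPTH_COMPONENT format.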

Actual rendering of the depth scene objects is done in drawDepthObjects(), which draws a Buddha statue and a floor plane. There is one trick here, though. Since we need only depth information and no color, color writes are disabled with a gl.colorMask(false, false, false, false) call and re-enabled afterwards with gl.colorMask(true, true, true, true). gl.colorMask() toggles writing of the red, green, blue, and alpha components individually, so setting all four to false completely skips writing to the color buffer. The resulting depth information of the scene can be viewed by uncommenting the call to drawTestDepth() in the drawScene() method. Because the depth texture is single-channel, it is treated as red-only, so the green and blue channels contain zeros. The result looks like this when visualized:

Scene depth visualized
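The depth-only trick described above can be sketched as a small helper (hypothetical names; in the project the logic lives inside drawDepthObjects()):

```javascript
// Sketch of a depth-only pass: disable color writes, draw the occluders,
// then restore color writes. drawOccluders is a placeholder callback for
// the actual scene draw calls (statue, floor plane, etc.).
function drawDepthPass(gl, drawOccluders) {
  gl.colorMask(false, false, false, false); // skip all color writes; depth writes still happen
  drawOccluders();
  gl.colorMask(true, true, true, true);     // re-enable color writes for normal rendering
}
```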

Rendering particles

Shader used for rendering of soft particles can be found in SoftDiffuseColoredShader.js. Let’s take a look at how it works.

The main idea of detecting an intersection between a particle and scene geometry is to compare the fragment's depth with the scene depth stored in the texture.

The first thing needed to compare depths is to linearize them, because the raw depth-buffer values are non-linear. This is done with the calc_depth() function; the technique is described here: https://community.khronos.org/t/soft-blending-do-it-yourself-solved/58190. Linearization requires the vec2 uCameraRange uniform, whose x and y components hold the near and far planes. The shader then calculates the linear depth difference between the particle geometry and the scene, storing it in a variable named a. However, if we apply this coefficient to the particle color directly, we get particles that are too dim: they fade away linearly from any geometry behind them, and this fade is quite fast. This is how the linear depth difference looks when visualized (you can uncomment the corresponding line in the shader to see it):

Linear depth difference
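For clarity, here is a JavaScript port of the kind of linearization the shader performs. This is an assumed form based on the Khronos thread linked above; linearizeDepth() and depthDifference() are illustrative names, with near/far corresponding to uCameraRange.x and uCameraRange.y:

```javascript
// Convert a [0..1] depth-buffer value into linear depth, normalized so that
// a value of 1.0 corresponds to the far plane (assumed form of calc_depth()).
function linearizeDepth(z, near, far) {
  return (2.0 * near) / (far + near - z * (far - near));
}

// The soft-edge input is the linear difference between the scene depth
// sampled from the texture and the particle fragment's own depth.
function depthDifference(sceneZ, particleZ, near, far) {
  return linearizeDepth(sceneZ, near, far) - linearizeDepth(particleZ, near, far);
}
```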

To make particles more transparent only near the intersection edge (which is where a = 0), we apply the GLSL smoothstep() function to it with the uTransitionSize coefficient, which defines the size of the soft edge. If you want to understand how smoothstep() works and see some more cool examples of how to use it, you should read this great article: http://www.fundza.com/rman_shaders/smoothstep/. The final blending coefficient is stored in a variable simply named b. For the blending mode used by our particles, we simply multiply the particle's diffuse color by this coefficient; in other implementations it may be applied to the alpha channel instead. If you uncomment the corresponding line in the shader to visualize this coefficient, you will see an image similar to this one:

Applied smoothstep()
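As a sketch, a JavaScript port of GLSL smoothstep() and the resulting blend coefficient b might look like this (softBlendCoefficient() is an illustrative name, not one taken from the shader):

```javascript
// Port of GLSL smoothstep(): clamp, then apply the cubic Hermite curve.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0.0), 1.0);
  return t * t * (3.0 - 2.0 * t);
}

// b = smoothstep(0, uTransitionSize, a): fully transparent exactly at the
// intersection (a = 0), fully opaque once the linear depth difference
// exceeds the transition size.
function softBlendCoefficient(a, transitionSize) {
  return smoothstep(0.0, transitionSize, a);
}
```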

Here you can see the visual difference between various values of the particle softness uniform:

Different particles softness

Sprite billboard meshes

Small dust particles are rendered as point sprites (rendered with GL_POINTS). This mode is easy to use because it automatically creates a quad shape in the fragment shader. However, point sprites are a bad choice for large smoke particles. First of all, they are frustum-culled by the point's center and thus disappear abruptly at screen edges. Also, a full quad shape is not very efficient and can add significant overdraw. We decided to use a custom particle mesh with an optimized shape: the corners, where the texture is completely transparent, are cut off:

Optimized particle mesh geometry
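To illustrate the idea (the vertex data below is hypothetical and not the actual mesh from the project), here is an octagonal billboard with the four transparent corners chamfered off, plus a shoelace-formula area check that quantifies how much fill area the cut saves compared to a full quad:

```javascript
// Hypothetical octagonal billboard in the quad's [-1, 1] local space.
// chamfer is the fraction of each edge removed at every corner.
function octagonBillboard(chamfer = 0.3) {
  const c = chamfer;
  // Eight vertices in counter-clockwise order.
  return [
    [-1 + c, -1], [1 - c, -1], [1, -1 + c], [1, 1 - c],
    [1 - c, 1], [-1 + c, 1], [-1, 1 - c], [-1, -1 + c],
  ];
}

// Shoelace formula: area of a simple polygon, used here to compare the
// octagon's rasterized area against the full quad (area 4).
function polygonArea(pts) {
  let a = 0;
  for (let i = 0; i < pts.length; i++) {
    const [x1, y1] = pts[i];
    const [x2, y2] = pts[(i + 1) % pts.length];
    a += x1 * y2 - x2 * y1;
  }
  return Math.abs(a) / 2;
}
```

Each cut corner is a right triangle with legs of length c, so the octagon's area is 4 - 2c². The real saving depends on how aggressively the transparent regions of the particle texture allow the shape to be trimmed.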