This is part 18 of a tutorial series about rendering. After wrapping up baked global illumination in part 17, we move on to supporting realtime GI. After that, we'll also support light probe proxy volumes and cross-fading LOD groups.

From now on, this tutorial series is made with Unity 2017.1.0f3. It won't work with older versions, because we'll end up using a new shader function.

Realtime Global Illumination

Baked lighting works very well for static geometry, and also pretty well for dynamic geometry thanks to light probes. However, it cannot deal with dynamic lights. Lights in mixed mode can get away with some realtime adjustments, but too much variation makes it obvious that the baked indirect light doesn't change. So when you have an outdoor scene, the sun has to be unchanging. It cannot travel across the sky like it does in real life, because that would require gradually changing GI. So the scene has to be frozen in time.

To make indirect lighting work with something like a moving sun, Unity uses the Enlighten system to calculate realtime global illumination. It works like baked indirect lighting, except that the lightmaps and probes are computed at runtime.

Figuring out indirect light requires knowledge of how light could bounce between static surfaces. The question is which surfaces are potentially affected by which other surfaces, and to what degree. Figuring out these relationships is a lot of work and cannot be done in realtime. So this data is processed by the editor and stored for use at runtime. Enlighten then uses it to compute the realtime lightmaps and probe data. Even then, it's only feasible with low resolution lightmaps.

Enabling Realtime GI

Realtime global illumination can be enabled independently of baked lighting. You can have none, one, or both active at the same time. It is enabled via the checkbox in the Realtime Lighting section of the Lighting window.

Both realtime and baked GI enabled.

To see realtime GI in action, set the mode of the main light in our test scene to Realtime. As we have no other lights, this effectively turns off baked lighting, even when it is enabled.

Realtime main light.

Make sure that all objects in the scene use our white material. Like last time, the spheres are all dynamic and everything else is static geometry.

Only dynamic objects receive realtime GI.

It turns out that only the dynamic objects benefit from realtime GI. The static objects have become darker. That's because the light probes automatically incorporate the realtime GI. Static objects have to sample the realtime lightmaps, which are not the same as the baked lightmaps. Our shader doesn't do this yet.
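If you want to set up the test scene from code instead of the inspector, a light's mode can also be changed in the editor via Light.lightmapBakeType. A minimal sketch, assuming an editor-only context; the component name MakeLightRealtime is hypothetical:

```csharp
using UnityEngine;

// Hypothetical editor helper: switches the attached light to realtime mode,
// equivalent to choosing Realtime in the light's Mode dropdown.
public class MakeLightRealtime : MonoBehaviour {

	void Reset () {
#if UNITY_EDITOR
		// Light.lightmapBakeType is an editor-only property.
		GetComponent<Light>().lightmapBakeType = LightmapBakeType.Realtime;
#endif
	}
}
```

This only matters for tooling; in normal use you would simply set the mode in the light's inspector.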

Baking Realtime GI

Unity already generates the realtime lightmaps while in edit mode, so you can always see the realtime GI contribution. These maps are not retained when switching between edit and play mode, but they end up the same. You can inspect the realtime lightmaps via the Object maps tab of the Lighting window, with a lightmap-static object selected. Choose the Realtime Intensity visualization to see the realtime lightmap data.

Realtime lightmap, with roof selected.

Although realtime lightmaps are already baked, and they might appear correct, our meta pass actually uses the wrong coordinates. Realtime GI has its own lightmap coordinates, which can end up different from those for static lightmaps. Unity generates these coordinates automatically, based on the lightmap and object settings. They are stored in the third mesh UV channel. So add this data to VertexData in My Lightmapping.

struct VertexData {
	float4 vertex : POSITION;
	float2 uv : TEXCOORD0;
	float2 uv1 : TEXCOORD1;
	float2 uv2 : TEXCOORD2;
};

Now MyLightmappingVertexProgram has to use either the second or third UV set, together with either the static or dynamic lightmap's scale and offset. We can rely on the UnityMetaVertexPosition function to use the right data.

Interpolators MyLightmappingVertexProgram (VertexData v) {
	Interpolators i;
//	v.vertex.xy = v.uv1 * unity_LightmapST.xy + unity_LightmapST.zw;
//	v.vertex.z = v.vertex.z > 0 ? 0.0001 : 0;

//	i.pos = UnityObjectToClipPos(v.vertex);
	i.pos = UnityMetaVertexPosition(
		v.vertex, v.uv1, v.uv2, unity_LightmapST, unity_DynamicLightmapST
	);
	i.uv.xy = TRANSFORM_TEX(v.uv, _MainTex);
	i.uv.zw = TRANSFORM_TEX(v.uv, _DetailTex);
	return i;
}

What does UnityMetaVertexPosition look like? It does what we used to do, except it uses flags made available via unity_MetaVertexControl to decide which coordinate sets and lightmaps to use.
float4 UnityMetaVertexPosition (
	float4 vertex, float2 uv1, float2 uv2,
	float4 lightmapST, float4 dynlightmapST
) {
	if (unity_MetaVertexControl.x) {
		vertex.xy = uv1 * lightmapST.xy + lightmapST.zw;
		// OpenGL right now needs to actually use incoming vertex position,
		// so use it in a very dummy way
		vertex.z = vertex.z > 0 ? 1.0e-4f : 0.0f;
	}
	if (unity_MetaVertexControl.y) {
		vertex.xy = uv2 * dynlightmapST.xy + dynlightmapST.zw;
		// OpenGL right now needs to actually use incoming vertex position,
		// so use it in a very dummy way
		vertex.z = vertex.z > 0 ? 1.0e-4f : 0.0f;
	}
	return UnityObjectToClipPos(vertex);
}

Note that the meta pass is used for both baked and realtime lightmapping. So when realtime GI is used, it will also be included in builds.

Sampling Realtime Lightmaps

To actually sample the realtime lightmaps, we also have to add the third UV set to VertexData in My Lighting.

struct VertexData {
	float4 vertex : POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
	float2 uv : TEXCOORD0;
	float2 uv1 : TEXCOORD1;
	float2 uv2 : TEXCOORD2;
};

When a realtime lightmap is used, we have to add its lightmap coordinates to our interpolators. The standard shader combines both lightmap coordinate sets in a single interpolator – multiplexed with some other data – but we can get away with separate interpolators for both. We know that there is dynamic light data when the DYNAMICLIGHTMAP_ON keyword is defined. It's part of the keyword list of the multi_compile_fwdbase compiler directive.

struct Interpolators {
	…
	#if defined(DYNAMICLIGHTMAP_ON)
		float2 dynamicLightmapUV : TEXCOORD7;
	#endif
};

Fill the coordinates just like the static lightmap coordinates, except with the dynamic lightmap's scale and offset, made available via unity_DynamicLightmapST.

Interpolators MyVertexProgram (VertexData v) {
	…
	#if defined(LIGHTMAP_ON) || ADDITIONAL_MASKED_DIRECTIONAL_SHADOWS
		i.lightmapUV = v.uv1 * unity_LightmapST.xy + unity_LightmapST.zw;
	#endif
	#if defined(DYNAMICLIGHTMAP_ON)
		i.dynamicLightmapUV =
			v.uv2 * unity_DynamicLightmapST.xy + unity_DynamicLightmapST.zw;
	#endif
	…
}

Sampling the realtime lightmap is done in our CreateIndirectLight function. Duplicate the #if defined(LIGHTMAP_ON) code block and make a few changes. First, the new block is based on the DYNAMICLIGHTMAP_ON keyword. Also, it should use DecodeRealtimeLightmap instead of DecodeLightmap, because the realtime maps use a different color format. Because this data might be added to baked lighting, don't immediately assign to indirectLight.diffuse, but use an intermediate variable which is added to it at the end. Finally, we should only sample spherical harmonics when neither a baked nor a realtime lightmap is used.
#if defined(LIGHTMAP_ON)
	indirectLight.diffuse =
		DecodeLightmap(UNITY_SAMPLE_TEX2D(unity_Lightmap, i.lightmapUV));

	#if defined(DIRLIGHTMAP_COMBINED)
		float4 lightmapDirection = UNITY_SAMPLE_TEX2D_SAMPLER(
			unity_LightmapInd, unity_Lightmap, i.lightmapUV
		);
		indirectLight.diffuse = DecodeDirectionalLightmap(
			indirectLight.diffuse, lightmapDirection, i.normal
		);
	#endif

	ApplySubtractiveLighting(i, indirectLight);
//#else
//	indirectLight.diffuse += max(0, ShadeSH9(float4(i.normal, 1)));
#endif

#if defined(DYNAMICLIGHTMAP_ON)
	float3 dynamicLightDiffuse = DecodeRealtimeLightmap(
		UNITY_SAMPLE_TEX2D(unity_DynamicLightmap, i.dynamicLightmapUV)
	);

	#if defined(DIRLIGHTMAP_COMBINED)
		float4 dynamicLightmapDirection = UNITY_SAMPLE_TEX2D_SAMPLER(
			unity_DynamicDirectionality, unity_DynamicLightmap,
			i.dynamicLightmapUV
		);
		indirectLight.diffuse += DecodeDirectionalLightmap(
			dynamicLightDiffuse, dynamicLightmapDirection, i.normal
		);
	#else
		indirectLight.diffuse += dynamicLightDiffuse;
	#endif
#endif

#if !defined(LIGHTMAP_ON) && !defined(DYNAMICLIGHTMAP_ON)
	indirectLight.diffuse += max(0, ShadeSH9(float4(i.normal, 1)));
#endif

Realtime GI applied to everything.

Now realtime lightmaps are used by our shader. Initially, it might look the same as baked lighting with a mixed light, when using distance shadowmask mode. The difference becomes obvious when turning off the light while in play mode.

Indirect light remains after disabling mixed light.

After disabling a mixed light, its indirect light will remain. In contrast, the indirect contribution of a realtime light disappears – and reappears – as it should. However, it might take a while before the new situation is fully baked. Enlighten incrementally adjusts the lightmaps and probes. How quickly this happens depends on the complexity of the scene and the Realtime Global Illumination CPU quality tier setting.

Toggling realtime light with realtime GI.

All realtime lights contribute to realtime GI.
However, its typical use is with the main directional light only, representing the sun as it moves through the sky. It is fully functional for directional lights. Point lights and spotlights work too, but only unshadowed. So when using shadowed point lights or spotlights you can end up with incorrect indirect lighting.

Realtime spotlight with unshadowed indirect light.

If you want to exclude a realtime light from realtime GI, you can do so by setting its Indirect Multiplier to zero.
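The Indirect Multiplier is also exposed to scripts, as Light.bounceIntensity, so a light can be excluded from realtime GI at runtime too. A minimal sketch; the IndirectLightToggle component is hypothetical, but Light.bounceIntensity is the actual Unity API behind the inspector field:

```csharp
using UnityEngine;

// Hypothetical helper that toggles a light's contribution to realtime GI.
public class IndirectLightToggle : MonoBehaviour {

	Light lightComponent;

	void Awake () {
		lightComponent = GetComponent<Light>();
	}

	// An indirect multiplier of zero excludes the light from realtime GI,
	// while the light's direct contribution remains unchanged.
	public void SetIndirectLight (bool enabled) {
		lightComponent.bounceIntensity = enabled ? 1f : 0f;
	}
}
```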

Emissive Light

Realtime GI can also be used for static objects that emit light. This makes it possible to vary their emission with matching realtime indirect light. Let's try this out. Add a static sphere to the scene and give it a material that uses our shader with a black albedo and a white emission color. Initially, we can only see the indirect effects of the emitted light via the static lightmaps.

Baked GI with emissive sphere.

To bake emissive light into the static lightmap, we had to set the material's global illumination flags in our shader GUI. As we always set the flags to BakedEmissive, the light ends up in the baked lightmap. This is fine when the emissive light is constant, but doesn't allow us to animate it. To support both baked and realtime lighting for the emission, we have to make this configurable. We can do so by adding a choice for this to MyLightingShaderGUI, via the MaterialEditor.LightmapEmissionProperty method. Its single parameter is the property's indentation level.

void DoEmission () {
	MaterialProperty map = FindProperty("_EmissionMap");
	Texture tex = map.textureValue;
	EditorGUI.BeginChangeCheck();
	editor.TexturePropertyWithHDRColor(
		MakeLabel(map, "Emission (RGB)"), map,
		FindProperty("_Emission"), emissionConfig, false
	);
	editor.LightmapEmissionProperty(2);
	if (EditorGUI.EndChangeCheck()) {
		if (tex != map.textureValue) {
			SetKeyword("_EMISSION_MAP", map.textureValue);
		}
		foreach (Material m in editor.targets) {
			m.globalIlluminationFlags =
				MaterialGlobalIlluminationFlags.BakedEmissive;
		}
	}
}

We also have to stop overriding the flags each time an emission property has changed. Actually, it is a bit more complicated than that. One of the flag options is EmissiveIsBlack, which indicates that computation of the emission can be skipped. This flag is always set for new materials. To make indirect emission work, we have to guarantee that this flag is not set, regardless of whether we choose realtime or baked.
We can do this by always masking the EmissiveIsBlack bit of the flags value.

		foreach (Material m in editor.targets) {
			m.globalIlluminationFlags &=
				~MaterialGlobalIlluminationFlags.EmissiveIsBlack;
		}

Realtime GI with emissive sphere.

The visual difference between baked and realtime GI is that the realtime lightmap usually has a much lower resolution than the baked one. So when the emission doesn't change and you use baked GI anyway, make sure to take advantage of its higher resolution.

What is the purpose of EmissiveIsBlack?

It is an optimization, making it possible to skip part of the GI-baking process. However, it relies on the flag being set only when the emission color is indeed black. As the flags are set by the shader GUI, this is determined when a material is edited via the inspector. At least, that is how Unity's standard shader does it. So if the emission color is later changed by a script or the animation system, the flag is not adjusted. This is why many people don't understand why their animated emission doesn't affect realtime GI. It is also the source of the general wisdom of not setting the emission color to pure black if you want to change it at runtime. Instead of using that approach, we use LightmapEmissionProperty, which also offers the option of turning off GI for emission entirely. So the choice is explicit for the user, without any hidden behavior. Don't use emissive light? Make sure its GI is set to None.
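With the realtime option selected, the emission can be animated from a script. Note that changing the material color alone is not enough: the realtime GI system has to be told about the change, which is what Unity's DynamicGI.SetEmissive method is for. A minimal sketch; the EmissivePulse component is hypothetical, while _Emission is our shader's emission color property:

```csharp
using UnityEngine;

// Hypothetical sketch: pulse the emission of a static emissive object
// and push the change to the realtime GI system each frame.
public class EmissivePulse : MonoBehaviour {

	Renderer emissiveRenderer;
	Material emissiveMaterial;

	void Start () {
		emissiveRenderer = GetComponent<Renderer>();
		emissiveMaterial = emissiveRenderer.material;
	}

	void Update () {
		Color emission = Color.white * Mathf.Abs(Mathf.Sin(Time.time));
		// Update the material so the surface itself brightens and dims.
		emissiveMaterial.SetColor("_Emission", emission);
		// Notify Enlighten, so the realtime indirect light follows along.
		DynamicGI.SetEmissive(emissiveRenderer, emission);
	}
}
```

Because Enlighten updates incrementally, the indirect light will lag a little behind the surface's own brightness, depending on the CPU quality tier.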