Global Illumination (more specifically, Radiosity)

Real-time Local Reflections (calculated along the Fresnel)

Specular Lighting

Aperture-mapped Reflections

Screen Space Reflections

Physically-based Rendering

Emissive Materials/Area Lights

Screen Space Ambient Occlusion

Dynamic Wind Simulation System

Real-time Cloud Formation (influenced by wind)

Rayleigh Scattering/Mie Scattering

Full Volumetric Lighting

Bokeh DOF and Approximation of the Circle of Confusion

Sky Occlusion and Dynamic Shadow Volumes

Aperture-based Lens Flares

Sub-surface Scattering

Dynamically Localized Lightning Illumination

Per-Pixel Sky Irradiance

Fog Inscatter

Particle Lights

Puddle Formation and Evaporation

Posted on behalf of brainchild, courtesy of the Adopt-A-User program, Gaming Edition.

NOTE: Do you also have a thread that you feel is worth posting, but lack the posting capabilities to do so? Give the volunteers at Adopt-A-User a call, and we will assess whether or not your thread is deemed worthy of ERA ;)

NOTE 2: The subject of this topic is technical in nature; a certain level of understanding of video game technology is kinda required to make sense of what is being described. Please keep the discussion civil and to the point. I am just providing a service here and I'd like to keep a clean record on these. Thank you in advance and happy posting!

Now, on to the thread!

I've actually wanted to do a proper tech analysis of BOTW's engine for quite some time, but never really got around to it. However, with the new video capture feature on the Switch, I thought it would be the perfect opportunity to revisit the title and share my findings through videos that I've uploaded to Twitter.

I'll start off with a summary of my findings, but I'll also do a breakdown of each technical feature later in this post in order to keep things accessible. Whenever possible, I'm going to try to avoid redundancies. For example, if someone else like Digital Foundry has already covered a feature of the engine, I'm not going to bother covering it here. The purpose of this post (as with my SMO post) is to bring more exposure to the technical accomplishments in games that no one else has bothered to investigate.

Anyway, the list at the top of this post is a summary of the engine's features.

Global Illumination/Radiosity

First off, I just want to say that all real-time global illumination solutions are faked in one way or another, with varying degrees of accuracy. So anyone trying to dismiss the global illumination solution in BOTW simply because it doesn't use path tracing or something similar should really think about what they're saying.
The important part to take away from this is that it is being rendered in real time; it's not just lighting that has been baked into the textures, which is pretty impressive for an open-world game (especially on Wii U).

Now, what exactly is Radiosity? In 3D graphics rendering, it is a global illumination approximation of light bouncing off different surfaces and transferring color information from one surface to another along the way. The more accurate the Radiosity, the more light bounces need to be calculated in order to transfer the proper amount of color.

In Breath of the Wild, the engine uses light probes placed throughout the environment to collect color information about the surfaces located near each probe. There is no simulation of light bounces, just an approximation of what general colors should be coming from a given area. The exact algorithm BOTW uses to calculate this information is unclear, but my best guess is spherical harmonics or something similar, based on the color averages and the localization of the Radiosity. Unlike in Super Mario Odyssey, Radiosity in Breath of the Wild is pretty granular instead of being binary. The lighting information calculated from the light probes appears to be streamed and tied to the LOD system at the pipeline level, which makes it pretty efficient.

Initially, I assumed that the spherical harmonics probes might have been placed throughout the environment to gather color samples, as the lighting appeared to update to a general color as Link moved through the environment. However, after further investigation, I now know that those general color bounces were due to the lack of color variety in the environment. When I tested the global illumination in an area with lots of differently colored surfaces next to each other, it became clear how the GI system worked.
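The sort of probe-based color transfer I'm describing can be sketched like this. To be clear, everything here — the falloff, the weighting, the function names — is an illustrative guess on my part, not the game's actual algorithm:

```python
import math

def probe_bounce_color(probe_pos, surfaces):
    """Average the colors of nearby surfaces that face the probe,
    weighted by proximity, so that nearer surfaces dominate the
    resulting bounce color. Falloff is illustrative, not BOTW's."""
    total = [0.0, 0.0, 0.0]
    weight_sum = 0.0
    for pos, normal, color in surfaces:
        to_probe = [p - s for p, s in zip(probe_pos, pos)]
        dist = math.sqrt(sum(c * c for c in to_probe))
        if dist == 0.0:
            continue
        unit = [c / dist for c in to_probe]
        facing = sum(n * u for n, u in zip(normal, unit))
        if facing <= 0.0:          # surface faces away from the probe
            continue
        w = facing / (1.0 + dist * dist)   # inverse-square-ish falloff
        total = [t + w * c for t, c in zip(total, color)]
        weight_sum += w
    return [t / weight_sum for t in total] if weight_sum else total

# A red wall 1 unit from the probe and a green wall 4 units away
# on the opposite side: the nearer red wall dominates the result.
surfaces = [
    ((0.0, 0.0, 1.0), (0.0, 0.0, -1.0), (1.0, 0.0, 0.0)),  # red, near
    ((0.0, 0.0, -4.0), (0.0, 0.0, 1.0), (0.0, 1.0, 0.0)),  # green, far
]
bounce = probe_bounce_color((0.0, 0.0, 0.0), surfaces)
```

Running this gives a bounce color that is overwhelmingly red, which matches the behavior you'll see in the wall test below: proximity wins, no matter how many "bounces" you iterate.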
Notice how, as Link approaches the red wall, its color is transferred to all surfaces that face the opposite direction. The same is true for the green wall that sits directly opposite the red wall (though the effect isn't as intense, because the probe is closer to the red wall and the red wall's color is reflecting more intensely). In fact, at any given point this is happening in all directions: the ground transfers its color upwards, and any ceiling or colored surface directly above Link's head transfers its color downwards.

The probe continuously samples and transfers colors (which we can think of as light bounces) dynamically, as the probe will pick up new colors from each transfer and has to sample those as well. Eventually the end result stops changing in appearance, because the samples nearest the probe have the most dominant colors regardless of the number of color transfers. This process is sequential but very localized and very fast. The probe has a limited range to sample from and applies the results to materials in world space. Thanks to that efficiency, the probe can approximate the appearance of many, many bounces of light, but it will only look accurate in the areas closest to the probe.

Real-time Local Reflections

Ever since I started analyzing this game, the one area that always left me scratching my head was the local reflections. There were so many apparent inconsistencies that my theories were initially all over the place. I can now confidently say that I have fully solved the mystery behind how the local reflections work. It's a THREE-pronged approach, depending on the situation.

Specular Lighting

Sunlight, skylight, lightning, point lights, and area lights fall under this category. Initially I thought that Shrines and Towers did as well (since they're emissive, I assumed they were area lights), but after seeing the very revealing artifacts that they exhibit, that can be ruled out.
Not all glowing materials illuminate the environment, and Shrines and Towers are among the ones that don't.

Aperture-mapped Reflections

If this term seems new to you, it's probably because it is: based on the game's text dump, it appears to be what the BOTW devs have internally labeled their take on UE4's Scene Capture 2D reflections. This is how the environment is reflected. The virtual camera above Link's head (specifically, the aperture) has a relatively small FOV, so when Link moves, reflections (displayed in real time) can drift out of their proper space until the aperture takes another capture of the environment. You can see these kinds of artifacts, and the FOV itself, in the videos I've included.

Screen Space Reflections

Only materials that look laminated use this model, and those materials are exclusive to the Shrines. A value in their gloss map tells the engine to use SSR for these materials specifically. They will reflect anything on screen, which can be viewed at grazing angles. However, these materials use the aperture map for environment reflections as well, which was one of my sources of confusion: the incongruous behavior of the reflections on these materials led me to wrong assumptions about the materials outside of the Shrines. Thankfully, we have that sorted out now.

Physically-based Rendering

Before anyone asks: no, this does not mean 'physically correct looking materials'. It is simply a methodology applied to a 3D graphics rendering pipeline in which every material (textured surface) uniquely influences the way that light behaves when interacting with it. That is what happens in the real world, which is why it's called physically-based rendering (a concept based on real-world light physics).
Different materials cause light to behave differently, which is why we can visually differentiate between surfaces in the first place.

Traditionally, rendering pipelines relied on an artist's understanding of how light interacts with different real-world materials, and the look of the texture maps was defined by that understanding. As a result, there was a lot of inconsistency between different textured surfaces and their counterparts in the real world (which is understandable, as we can't expect artists to have encyclopedic knowledge of the properties of all matter). With PBR, the fundamentals of light physics are part of the pipeline itself, and all textured surfaces are classified as materials with unique properties that cause light to behave accordingly. Surfaces can be placed under different lighting conditions and dynamic camera angles, and the way light interacts with them adjusts dynamically; artists do not have to predefine this interaction like they did with the traditional workflow. It happens automatically. Because of the efficiency of PBR, developers feel more inclined to make games where all materials have unique properties that affect light differently.

In Breath of the Wild, PBR is used with a bit of artistic flair, so you might not notice that the engine even relies on such a pipeline, since the textures don't necessarily look realistic. However, the BRDFs (Bidirectional Reflectance Distribution Functions) used on the materials make it pretty clear that the engine uses PBR.
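As a concrete illustration of the kind of thing a BRDF encodes, here's the GGX/Trowbridge-Reitz normal-distribution term, a common choice in PBR pipelines. To be clear, there's no evidence BOTW uses this exact BRDF; it's just a representative example of material-dependent specular behavior:

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution: how tightly microfacet
    normals cluster around the surface normal. Low roughness gives a
    tall, tight specular peak; high roughness a broad, faint one."""
    alpha2 = roughness ** 4   # common alpha = roughness^2 remapping
    denom = n_dot_h * n_dot_h * (alpha2 - 1.0) + 1.0
    return alpha2 / (math.pi * denom * denom)

# Peak intensity, evaluated straight along the half-vector:
metal_peak = ggx_ndf(1.0, roughness=0.05)   # polished metal: intense, tight
wood_peak = ggx_ndf(1.0, roughness=0.7)     # rough wood: broad, faint
```

The point is that a single roughness value in a material's texture maps completely changes the shape of the highlight — exactly the wood-versus-metal difference described below.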
You see, with every dynamic light source, the specular highlights (the parts of a surface where the light source itself shows up as a reflection) and the reflectivity/reflectance of those highlights are dynamically generated depending on the angle of incidence (the angle of incoming light rays with respect to a surface normal) and the index of refraction (how much a material 'bends' light as rays touch its surface) of whatever material the light is interacting with. If the game were using a traditional pipeline, the distribution of those specular highlights would not differ much between wood and metal. But in this game, the production of specular highlights is completely dependent on the material that the light is interacting with.

Another key element that shows BOTW uses PBR is the Fresnel (pronounced fruh-NELL) reflections on all materials. First of all, most games using a traditional pipeline don't even bother with Fresnel, because at that point you might as well just use PBR. As I explained earlier when discussing local reflections, Fresnel reflections become visible at grazing angles (angles where incoming light is nearly parallel to the surface it's interacting with, from the perspective of the observer/camera).

According to the Fresnel reflection coefficient, all materials reach 100% reflectivity at grazing angles, but the effectiveness of that reflectivity depends on the roughness of the material. As a result, programmers differentiate between 'reflectivity' and 'reflectance'. Some materials reflect light in all directions (diffuse materials): even at 100% reflectivity, 100% of the light may be reflected from the total surface area, but it isn't all reflected in the same direction, so the light is spread out uniformly and you don't see any specular reflections (mirror images of the surface's surroundings).
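The grazing-angle behavior described above is commonly approximated in real-time renderers with Schlick's formula; here's a minimal sketch (the F0 value of 0.02 is a standard reference number for a dielectric like water, not something taken from the game):

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation of the Fresnel reflectance:
    F = F0 + (1 - F0) * (1 - cos(theta))^5,
    where f0 is the material's reflectance at normal incidence
    and theta is the angle between the view ray and the normal."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Head-on, a dielectric like water reflects only ~2% of incoming light...
normal_incidence = schlick_fresnel(1.0, 0.02)
# ...but at a grazing angle every material approaches 100% reflectivity.
grazing = schlick_fresnel(0.0, 0.02)
```

Note how F0 only matters away from the grazing angle — at the grazing angle itself the formula pushes every material toward full reflectivity, which is exactly the "all materials reach 100% reflectivity at grazing angles" behavior described above.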
Other materials reflect incident light only in the direction opposite to the one it was received from (specular materials), so you will only see reflections at the appropriate angle, where close to 90% of the light is reflected. The reflectance (the effectiveness of a material's ability to reflect incident light) of both diffuse and specular materials is not always 100%, even at grazing angles, which is why you don't see perfectly specular reflections at grazing angles on all materials, even in the real world. The clarity of Fresnel reflections will vary with the materials producing them.

Emissive Materials and Area Lights

This one is pretty straightforward. The materials of glowing objects provide unique light sources that light the environment in the same shape as the materials themselves. These are not point light sources that radiate in all directions, or even simple directional light sources that light in one direction; they're basically 'custom-shaped' light sources. It's important to mention that only the global light sources (sun/moon/lightning) cast shadows. However, the BRDF still applies to all light sources in the game.

Screen Space Ambient Occlusion

In the real world, there is a certain amount of 'ambient light' that colors the environment after light has bounced around so much that it has become completely diffused. If shadows are the result of objects occluding direct sunlight, then ambient occlusion can be thought of as the result of cracks and crevices in the environment occluding ambient light.

The method used in BOTW is called SSAO (screen space ambient occlusion), as it calculates the AO in screen space and is view-dependent. The environment will only receive AO when it is perpendicular with respect to the camera.

Dynamic Wind Simulation System

So this one surprised me a bit, because I was not expecting it to be so robust. Basically, the physics system is tied to a wind simulation system. It's completely dynamic and affects different objects according to their respective weight values.
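A weight-scaled wind response like the one just described might look like this minimal sketch (all names and values here are illustrative; the game's actual weight values and response curve are unknown):

```python
def wind_displacement(wind_vector, weight, stiffness=1.0):
    """Scale a wind force by an object's weight: light objects
    (blades of grass) sway far more than heavy ones. The inverse
    relationship here is an illustrative guess, not BOTW's code."""
    if weight <= 0:
        raise ValueError("weight must be positive")
    scale = stiffness / weight
    return tuple(component * scale for component in wind_vector)

wind = (2.0, 0.0, 0.5)
grass = wind_displacement(wind, weight=0.5)     # light: large sway
boulder = wind_displacement(wind, weight=50.0)  # heavy: barely moves
```

The same wind vector drives every object; only the per-object weight decides how much each one actually moves, which is all a "completely dynamic" wind system needs to look convincing.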
The most prominent objects affected are the blades of grass and the procedurally generated clouds.

Real-time Cloud Formation

This game does not use a traditional skybox in any sense of the word. Clouds are procedurally generated based on parameters set by the engine. They cast real-time shadows, and they receive light information based on the sun's position in the sky. As far as I can tell, clouds are treated as an actual material in the game. They're not volumetric, so you won't be getting any crepuscular rays or anything like that, but they're not 'skybox' clouds either. Their formation is also influenced by the wind system.

Rayleigh Scattering/Mie Scattering

In the real world, when light reaches Earth's atmosphere, it is scattered by air molecules, which results in Earth's blue sky, since the shorter wavelengths of blue light are scattered more easily than other colors of light. However, as the sun approaches the horizon, its light has to pass through more of the atmosphere, so most of the blue light has been scattered away by the time the sunlight reaches the eye of the observer, leaving the longer wavelengths of orange and red light. BOTW approximates this phenomenon mathematically (I actually found this out through a text dump of the game's code earlier this year!). Apparently the algorithm accounts for Mie scattering as well, which gives fog its appearance in the sky.

Honestly, had I not looked at the code from that text dump, I would never have assumed that this phenomenon was being simulated in the game; it's just so easy to fake. However, after looking at the reflections of the sky in the water, it all made sense: this scattered light is being reflected onto the entire environment in real time, and a simple skybox would make that impossible.

Full Volumetric Lighting

Aside from clouds in the sky, every part of the environment and every object in it has the potential to create light shafts in real time, given the right lighting conditions.
The game uses SSAO to aid the effect, but the volumetric lighting is actually not view-dependent. You can find out more about how the volumetric lighting works in the shadow volumes section of this post.

Bokeh DOF and Approximation of the Circle of Confusion

Another surprising feature for an engine that I assume uses deferred lighting/shading. I'm going to simplify things a bit, because it can get really technical trying to explain why the Bokeh effect even happens in the real world. Suffice it to say that as light enters the aperture (opening) of an eye/camera, the incoming rays of light converge toward a single point on a focal plane. As light becomes more focused on this plane, its appearance becomes sharper and smaller; as it becomes more defocused away from this plane, it becomes larger and blurrier.

The Bokeh effect, as it is commonly known, is when the points of light that enter the camera lens take on the shape of the aperture they entered through (a hexagonal shape, for example). The circle of confusion is the region of focus within which a human cannot distinguish between a point of light that is perfectly in focus and one that is slightly out of focus; depth of field is usually determined by the circle of confusion. What's interesting is that BOTW emulates both of these concepts when using the Sheikah Scope or Camera Rune. My guess is that it's all calculated in screen space based on the texel (texture element) data, and then applied as a post-process effect.

Sky Occlusion and Dynamic Shadow Volumes

Aside from the physics in the game, these shading features are without a doubt the most computationally taxing elements in BOTW. Here's how it all works:

Even though the clouds themselves don't have any volume, they still cast (soft) shadows onto the environment. The sun and the scattered light from the sky illuminate the environment dynamically, and the environment and all of the objects in it cast their own shadows according to that illumination.
It wouldn't look very believable for the lighting in the environment to remain unchanged even when the sky is completely overcast with cloud cover, so Nintendo has implemented sky occlusion to solve this problem.

Using a Mie scattering algorithm (Mie coefficients that simulate the effect of atmospheric fog), the engine calculates how much skylight to remove from the environment based on how much fog or cloud cover is in the atmosphere. The more skylight that gets occluded, the more overcast the environment appears. Since there is less direct illumination in occluded areas, ambient light (diffuse, non-directional light) plays a greater role in illuminating those areas, and all of the shadows there become softer and start to match the colors of their immediate surroundings.

The engine also uses shadow volumes instead of simple shadow maps, and this is done for every shadow caster in the game. Shadow volumes are cast within a specified 3D space instead of just onto the surfaces and objects in an environment. Aside from making the sky occlusion look more believable, dynamically generating shadow volumes within a 3D space also provides full real-time volumetric lighting when combined with atmospheric fog that can receive shadows, which is exactly what happens in BOTW.

Aperture-based Lens Flares

This feature will go unnoticed by probably 99% of the people who play this game, so I'm not sure that it was worth implementing, tbh.

Basically, when rays from a bright light source enter a camera lens at certain oblique angles, they can produce optical artifacts known as lens flares, due to the rays reflecting internally among the camera's lens elements.
Most games just emulate this phenomenon by applying the flare as a post effect that appears when the light source is slightly off-center in the camera frustum; the concept of light internally reflecting within the camera itself isn't even factored into the equation.

In Breath of the Wild, since the engine already emulates a camera aperture for DOF, it tracks the aperture's position relative to the sun and calculates how much lens flare should be produced, even if the sun isn't on screen. But that's not all! Cameras with lots of zooming elements are even more prone to lens flares, and the flares change shape and size depending on the shape/size of the aperture and the level of zoom. Surprisingly, BOTW approximates these effects as well!

Sub-surface Scattering

Some surfaces in the real world are translucent (not to be confused with transparent), meaning that light can both pass through the surface and scatter inside of it. Some examples of real-world translucent surfaces are human skin, grapes, wax, and milk. Modeling this unique behavior of light in 3D graphics is called sub-surface scattering, or SSS. As with most real-time 3D rendering problems, programmers have come up with several methods to approximate the effect without having to simulate light bounces at the molecular level. The method used in BOTW is relatively simplistic but effective.

Any surface that should have some level of translucency is given multiple layers of materials in order to produce SSS. The first layer is the internal material. This material is usually baked with lighting information that gives it a translucent look; light travels through the material but does not actually light the material itself in real time. On top of this material is the surface material.
This material is the more dominant of the two, and is what you will see in most lighting conditions.

The relationship between these materials works in such a way that the dominant appearance of either material is always determined by the ratio between incident light and transmitted light. If the surface material is reflecting more light than the internal material is transmitting, the surface material increases in opacity in proportion to the light it's receiving. If the internal material is transmitting more light than the surface material is reflecting, the surface material decreases in opacity in proportion to the light it's not receiving. Balancing the opacity of the surface material according to the incidence/transmittance ratio is a very smart and efficient way to give materials an SSS effect.

Dynamically Localized Lightning Illumination

Lots of games implement the illumination of an environment by lightning as a global light source: it flashes over the entire environment, and all shadow casters cast shadows in predetermined sizes and directions.

In BOTW, lightning strikes are basically big-ass camera flashes, each with its own radius and intensity, and they can strike anywhere on the map, regardless of the player's location. What's interesting about BOTW's lightning system is that shadows dynamically correspond to the intensity and location of the shadow caster's nearest lightning strike. This is probably the coolest lightning system I've ever seen in a game.

Per-Pixel Sky Irradiance

If radiance can be thought of as the amount of radiation coming from the sun, irradiance can be thought of as the amount of that radiation that a given surface actually receives. This is a pretty important variable for scattering skylight, because its absence is the main reason we can see into space at night! BOTW calculates irradiance using an algorithm that tracks the sun's position relative to zenith, and during sunsets it starts to remove skylight, pixel by pixel, until there is no irradiance left.
Provided the sky is free of cloud cover and Mie scattering, stars will start to appear in the sky, even if the sky isn't dark yet. The color gradient transitions between night and day are really impressive.

Fog Inscatter

In the real world, fog receives both light and shade, like a physical object. This is computationally expensive to do with computer graphics if the fog is volumetric. BOTW gets around this by creating a fog noise pattern (similar to its ambient occlusion noise pattern, but not restricted to screen space) and applying radiance values from the sun and skylight to produce 'inscatter'. When you combine this with shadow volumes, not only do you get volumetric lighting, you also get fog that looks like it has volume even when it doesn't.

Particle Lights

Almost every particle in the game is emissive (glowing), and many of them illuminate the environment as well. Instead of being rendered as objects, many particles are simply point light sources that radiate in all directions in 3D space.

Puddle Formation and Evaporation

Probably the most bizarre but also the most clever rendering solution in the game. Underneath the entire terrain of the game world, there exists a plane of water materials that rises to fill water basins when it's raining and lowers to evaporate the water when the sun comes back out. A foam material layer is applied depending on the water surface's distance from the ground. The process is pretty straightforward while also serving as yet another impressive dynamic in the game.
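The rain-driven water plane described above can be sketched as a simple per-tick update. The rates, the fill cap, and the foam threshold below are all invented for illustration; the game's actual parameters are unknown:

```python
def step_water_plane(level, ground, raining, fill_rate=0.02, evap_rate=0.01):
    """Advance the hidden water plane one tick: rain raises it toward
    a cap above the terrain, and sunshine evaporates it back down.
    A foam flag is set while the surface sits just above the ground."""
    if raining:
        level = min(level + fill_rate, ground + 0.5)   # cap above terrain
    else:
        level = max(level - evap_rate, ground)         # dry back to ground
    foam = 0.0 < (level - ground) < 0.05               # thin-water foam band
    return level, foam

level, ground = 0.0, 0.0
for _ in range(10):                  # ten ticks of rain fill the basin
    level, foam = step_water_plane(level, ground, raining=True)
filled = level
for _ in range(30):                  # the sun comes back out
    level, foam = step_water_plane(level, ground, raining=False)
```

Because the plane only ever moves relative to the terrain, a single height value per basin is enough to drive both the visible puddle and the foam layer at its edges.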