This tutorial is about supporting triplanar texture mapping. It uses the FXAA tutorial project as its foundation.

This tutorial is made with Unity 2017.4.1f1.

Texturing Without UV Coordinates

The usual way to perform texture mapping is to use the UV coordinates stored per-vertex in a mesh. But this is not the only way to do it. Sometimes there are no UV coordinates available, for example when working with procedural geometry of arbitrary shapes. When creating terrain or cave systems at run time, it usually isn't feasible to generate UV coordinates for an appropriate texture unwrap. In those cases, we have to use an alternative way to map textures onto our surfaces. One such way is triplanar mapping.
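To make the idea concrete before we restructure anything, here is a minimal sketch of triplanar sampling; TriplanarSample is a hypothetical helper, not the code this tutorial ends up with. A texture is sampled three times, projected along each world axis, and the samples are blended with weights derived from the surface normal.

	// Minimal triplanar sketch; TriplanarSample is a hypothetical helper.
	float3 TriplanarSample (sampler2D tex, float3 position, float3 normal) {
		// Project the texture along each world axis, using position as UV.
		float3 x = tex2D(tex, position.zy).rgb;
		float3 y = tex2D(tex, position.xz).rgb;
		float3 z = tex2D(tex, position.xy).rgb;
		// Weigh each projection by how much the normal points along its axis.
		float3 weights = abs(normal);
		weights /= weights.x + weights.y + weights.z;
		return x * weights.x + y * weights.y + z * weights.z;
	}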

Up to this point, we've always assumed that UV coordinates are available. Our My Lighting Input and My Lighting shader include files depend on them. While we could create alternatives that do not depend on vertex UV, it would be more convenient if our current files could be made to work both with and without UV. This requires a few changes.

We keep the current approach as the default, but will switch to working without UV when NO_DEFAULT_UV is defined.
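For example, a shader pass that has no use for vertex UV could define the macro before pulling in the include files. This is just a sketch of the intended usage; the pass code around it is omitted.

	// Hypothetical pass snippet: defining NO_DEFAULT_UV before inclusion
	// activates the UV-free code paths in the include files.
	#define NO_DEFAULT_UV

	#include "My Lighting Input.cginc"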

Doing Without Default UV

When the mesh data doesn't contain UV, we don't have any UV to pass from the vertex to the fragment program. So make the existence of the UV interpolator in My Lighting Input dependent on NO_DEFAULT_UV.

	struct InterpolatorsVertex {
		…
		#if !defined(NO_DEFAULT_UV)
			float4 uv : TEXCOORD0;
		#endif
		…
	};

	struct Interpolators {
		…
		#if !defined(NO_DEFAULT_UV)
			float4 uv : TEXCOORD0;
		#endif
		…
	};

There are multiple functions that assume the interpolators always contain UV, so we have to make sure that they keep working and compiling. We'll do that by introducing a new GetDefaultUV function below the interpolator declarations. When no UV are available, it simply returns zeros; otherwise, it returns the regular UV. We'll also make it possible to provide an alternative approach by defining UV_FUNCTION, in case that might be useful. This works like ALBEDO_FUNCTION, except that an override has to be defined before the inclusion of My Lighting Input.

	float4 GetDefaultUV (Interpolators i) {
		#if defined(NO_DEFAULT_UV)
			return float4(0, 0, 0, 0);
		#else
			return i.uv;
		#endif
	}

	#if !defined(UV_FUNCTION)
		#define UV_FUNCTION GetDefaultUV
	#endif

Now we can replace all usage of i.uv with UV_FUNCTION(i). I've only shown the change for GetDetailMask, but it applies to all getter functions; another example follows at the end of this section.

	float GetDetailMask (Interpolators i) {
		#if defined (_DETAIL_MASK)
			return tex2D(_DetailMask, UV_FUNCTION(i).xy).a;
		#else
			return 1;
		#endif
	}

Moving on to My Lighting, we must make sure that all UV-related work in the vertex program is skipped when no UV are available. This applies to the texture coordinate transformation, and also to the default vertex displacement approach.

	InterpolatorsVertex MyVertexProgram (VertexData v) {
		…
		#if !defined(NO_DEFAULT_UV)
			i.uv.xy = TRANSFORM_TEX(v.uv, _MainTex);
			i.uv.zw = TRANSFORM_TEX(v.uv, _DetailTex);

			#if VERTEX_DISPLACEMENT
				float displacement =
					tex2Dlod(_DisplacementMap, float4(i.uv.xy, 0, 0)).g;
				displacement = (displacement - 0.5) * _DisplacementStrength;
				v.normal = normalize(v.normal);
				v.vertex.xyz += v.normal * displacement;
			#endif
		#endif
		…
	}

The parallax effect also relies on the default UV, so skip it when UV are not available.

	void ApplyParallax (inout Interpolators i) {
		#if defined(_PARALLAX_MAP) && !defined(NO_DEFAULT_UV)
			…
		#endif
	}
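For reference, here is the same substitution applied to the albedo getter. The body shown matches the version used earlier in this series, so your copy may differ slightly; only the two UV_FUNCTION substitutions matter here.

	float3 GetAlbedo (Interpolators i) {
		float3 albedo =
			tex2D(_MainTex, UV_FUNCTION(i).xy).rgb * _Color.rgb;
		#if defined (_DETAIL_ALBEDO_MAP)
			float3 details =
				tex2D(_DetailTex, UV_FUNCTION(i).zw) * unity_ColorSpaceDouble;
			albedo = lerp(albedo, albedo * details, GetDetailMask(i));
		#endif
		return albedo;
	}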

Collecting Surface Properties

Without UV, there must be another way to determine the surface properties used for lighting. To make this as generic as possible, our include files shouldn't care how these properties are obtained. All we need is a universal way to provide surface properties. We can use an approach akin to Unity's surface shaders, relying on a function to set all surface properties; the end of this section recalls what that pattern looks like.

Create a new MySurface.cginc include file. In it, define a SurfaceData struct that contains all surface properties needed for lighting. That's albedo, emission, normal, alpha, metallic, occlusion, and smoothness.

	#if !defined(MY_SURFACE_INCLUDED)
	#define MY_SURFACE_INCLUDED

	struct SurfaceData {
		float3 albedo, emission, normal;
		float alpha, metallic, occlusion, smoothness;
	};

	#endif

We put it in a separate file, so other code can use it before including any other files. But our files will rely on it as well, so include it in My Lighting Input.

	#include "UnityPBSLighting.cginc"
	#include "AutoLight.cginc"

	#include "MySurface.cginc"

In My Lighting, set up a new SurfaceData surface variable with the default functions, at the beginning of MyFragmentProgram, after ApplyParallax and before alpha is used. Then change the alpha code to rely on surface.alpha instead of invoking GetAlpha. Also move InitializeFragmentNormal up, so the normal vector is handled before the surface is configured.

	FragmentOutput MyFragmentProgram (Interpolators i) {
		UNITY_SETUP_INSTANCE_ID(i);

		#if defined(LOD_FADE_CROSSFADE)
			UnityApplyDitherCrossFade(i.vpos);
		#endif

		ApplyParallax(i);
		InitializeFragmentNormal(i);

		SurfaceData surface;
		surface.normal = i.normal;
		surface.albedo = ALBEDO_FUNCTION(i);
		surface.alpha = GetAlpha(i);
		surface.emission = GetEmission(i);
		surface.metallic = GetMetallic(i);
		surface.occlusion = GetOcclusion(i);
		surface.smoothness = GetSmoothness(i);

		float alpha = surface.alpha;
		#if defined(_RENDERING_CUTOUT)
			clip(alpha - _Cutoff);
		#endif

	//	InitializeFragmentNormal(i);
		…
	}

Now rely on surface instead of invoking the getter functions again when determining the fragment's color.

	float3 albedo = DiffuseAndSpecularFromMetallic(
		surface.albedo, surface.metallic, specularTint, oneMinusReflectivity
	);
	…

	float4 color = UNITY_BRDF_PBS(
		albedo, specularTint, oneMinusReflectivity, surface.smoothness,
		i.normal, viewDir, CreateLight(i), CreateIndirectLight(i, viewDir)
	);
	color.rgb += surface.emission;

Do the same when filling the G-buffers for deferred rendering.

	#if defined(DEFERRED_PASS)
		#if !defined(UNITY_HDR_ON)
			color.rgb = exp2(-color.rgb);
		#endif
		output.gBuffer0.rgb = albedo;
		output.gBuffer0.a = surface.occlusion;
		output.gBuffer1.rgb = specularTint;
		output.gBuffer1.a = surface.smoothness;
		output.gBuffer2 = float4(i.normal * 0.5 + 0.5, 1);
		output.gBuffer3 = color;
		…
	#endif

The CreateIndirectLight function also used the getter functions, so add a SurfaceData parameter to it and use that instead.

	UnityIndirect CreateIndirectLight (
		Interpolators i, float3 viewDir, SurfaceData surface
	) {
		…

		#if defined(FORWARD_BASE_PASS) || defined(DEFERRED_PASS)
			…
			float3 reflectionDir = reflect(-viewDir, i.normal);
			Unity_GlossyEnvironmentData envData;
			envData.roughness = 1 - surface.smoothness;
			…
			float occlusion = surface.occlusion;
			…
		#endif

		return indirectLight;
	}

Then add surface as an argument to its invocation in MyFragmentProgram.

	CreateLight(i), CreateIndirectLight(i, viewDir, surface)
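For comparison, this is the standard Unity surface shader pattern that the approach above mimics: a function receives input data and fills a struct describing the surface. The snippet is only a reminder and plays no role in our files.

	sampler2D _MainTex;

	struct Input {
		float2 uv_MainTex;
	};

	void surf (Input IN, inout SurfaceOutputStandard o) {
		o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
		o.Metallic = 0;
		o.Smoothness = 0.5;
	}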

Customized Surfaces

To make it possible to change how the surface data is obtained, we'll again allow the definition of a custom function. This function needs input to work with. By default, that would be the UV coordinates, with both the main and detail UV packed in a single float4. Alternative inputs could be a position and a normal vector. Add a SurfaceParameters struct to MySurface.cginc that contains all these inputs.

	struct SurfaceData {
		float3 albedo, emission, normal;
		float alpha, metallic, occlusion, smoothness;
	};

	struct SurfaceParameters {
		float3 normal, position;
		float4 uv;
	};

Back in My Lighting, adjust MyFragmentProgram so it uses a different way to set up the surface data when a SURFACE_FUNCTION is defined. When this is the case, fill surface with the normal vector and set all other values to their defaults. Then create the surface parameters and invoke the custom surface function. Its arguments are the surface, as an inout parameter, and the parameters struct.

	SurfaceData surface;
	#if defined(SURFACE_FUNCTION)
		surface.normal = i.normal;
		surface.albedo = 1;
		surface.alpha = 1;
		surface.emission = 0;
		surface.metallic = 0;
		surface.occlusion = 1;
		surface.smoothness = 0.5;

		SurfaceParameters sp;
		sp.normal = i.normal;
		sp.position = i.worldPos.xyz;
		sp.uv = UV_FUNCTION(i);

		SURFACE_FUNCTION(surface, sp);
	#else
		surface.normal = i.normal;
		surface.albedo = ALBEDO_FUNCTION(i);
		surface.alpha = GetAlpha(i);
		surface.emission = GetEmission(i);
		surface.metallic = GetMetallic(i);
		surface.occlusion = GetOcclusion(i);
		surface.smoothness = GetSmoothness(i);
	#endif

As SURFACE_FUNCTION might change the surface normal, assign it back to i.normal afterwards. That way we don't need to change all the code that uses i.normal.

	SurfaceData surface;
	#if defined(SURFACE_FUNCTION)
		…
	#else
		…
	#endif
	i.normal = surface.normal;
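To see the hook in action, here is a sketch of a hypothetical custom surface function; the function name, the texture, and the values it assigns are made up for illustration. Such a function would live in its own include file, pull in MySurface.cginc for the struct definitions, and be selected by defining SURFACE_FUNCTION before My Lighting gets included.

	// Hypothetical custom surface function that planar-maps a texture
	// along the world Y axis, using the provided world position.
	#include "MySurface.cginc"

	sampler2D _TopTex;

	void MyPlanarSurfaceFunction (
		inout SurfaceData surface, SurfaceParameters parameters
	) {
		surface.albedo = tex2D(_TopTex, parameters.position.xz).rgb;
		surface.smoothness = 0.4;
	}

	#define SURFACE_FUNCTION MyPlanarSurfaceFunction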

No Tangent Space

Note that unlike Unity's surface shader approach, we're working with a normal vector in world space, not tangent space. If we want to use tangent-space normal mapping in SURFACE_FUNCTION, then we have to do so explicitly ourselves. We could also support more configuration options for how the normal should be treated both before and after invoking SURFACE_FUNCTION, but we won't do that in this tutorial.

What we will do is make it possible to turn off the default tangent-space normal mapping approach. This saves work when tangents aren't used. We'll do this by only turning on tangent space when the default normal mapping or parallax mapping is active. Indicate this in My Lighting Input with a convenient REQUIRES_TANGENT_SPACE macro.

	#if defined(_NORMAL_MAP) || defined(_DETAIL_NORMAL_MAP) || defined(_PARALLAX_MAP)
		#define REQUIRES_TANGENT_SPACE 1
		#define TESSELLATION_TANGENT 1
	#endif

	#define TESSELLATION_UV1 1
	#define TESSELLATION_UV2 1

Now we only have to include the tangent and binormal vector interpolators when needed.

	struct InterpolatorsVertex {
		…
		#if REQUIRES_TANGENT_SPACE
			#if defined(BINORMAL_PER_FRAGMENT)
				float4 tangent : TEXCOORD2;
			#else
				float3 tangent : TEXCOORD2;
				float3 binormal : TEXCOORD3;
			#endif
		#endif
		…
	};

	struct Interpolators {
		…
		#if REQUIRES_TANGENT_SPACE
			#if defined(BINORMAL_PER_FRAGMENT)
				float4 tangent : TEXCOORD2;
			#else
				float3 tangent : TEXCOORD2;
				float3 binormal : TEXCOORD3;
			#endif
		#endif
		…
	};

In My Lighting, we can now skip setting up these vectors in MyVertexProgram when they aren't needed.

	InterpolatorsVertex MyVertexProgram (VertexData v) {
		…
		#if REQUIRES_TANGENT_SPACE
			#if defined(BINORMAL_PER_FRAGMENT)
				i.tangent =
					float4(UnityObjectToWorldDir(v.tangent.xyz), v.tangent.w);
			#else
				i.tangent = UnityObjectToWorldDir(v.tangent.xyz);
				i.binormal = CreateBinormal(i.normal, i.tangent, v.tangent.w);
			#endif
		#endif
		…
	}

And without tangent space, InitializeFragmentNormal is reduced to simply normalizing the interpolated normal.

	void InitializeFragmentNormal(inout Interpolators i) {
		#if REQUIRES_TANGENT_SPACE
			float3 tangentSpaceNormal = GetTangentSpaceNormal(i);
			#if defined(BINORMAL_PER_FRAGMENT)
				float3 binormal = CreateBinormal(i.normal, i.tangent.xyz, i.tangent.w);
			#else
				float3 binormal = i.binormal;
			#endif

			i.normal = normalize(
				tangentSpaceNormal.x * i.tangent +
				tangentSpaceNormal.y * binormal +
				tangentSpaceNormal.z * i.normal
			);
		#else
			i.normal = normalize(i.normal);
		#endif
	}
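For reference, the CreateBinormal function used above was defined in My Lighting Input much earlier in this series. Your copy should look roughly like this; it folds the mesh's tangent sign and the odd-negative-scale factor into the cross product.

	float3 CreateBinormal (float3 normal, float3 tangent, float binormalSign) {
		return cross(normal, tangent.xyz) *
			(binormalSign * unity_WorldTransformParams.w);
	}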