Today we welcome the team at Cloud Imperium Games, as they go into the details about the Substance workflow they developed for the making of the upcoming space game Star Citizen. The amount of texturing work for the planets and terrains is huge, so we’ll discuss how they managed to meet both the technical and artistic goals in their texturing pipeline.

Introductions

Michel: Hi, my name is Michel Kooper and I’m Lead Environment Artist in our Frankfurt office. I’ve been working in games for a little over 10 years now. Before joining Cloud Imperium Games to set up the German environment team I was at Crytek for five years, working on games like Crysis, Ryse, Homefront 2, and Hunt: Showdown.

I joined the company in January 2017 and have been working as Lead Environment Artist in the German office, building up and running the new environment team and working with the engineers on our planet tech and creation pipeline for organic/natural assets.

Sebastian: Hello, I am Sebastian Schröder and one of the Senior Environment Artists here. I’ve spent most of the past nine years working as a freelance environment artist on various titles.

I joined the company in April 2017, and my role has since developed into being mostly focused on improving our planet-related technology and pipelines with custom tools and workflows.

Star Citizen

Michel: With Star Citizen, we are creating a universe that combines the freedom of exploration, the thrill of combat, and the unique challenge of building a life in space. We are aiming to put ultimate control in the hands of the player. Whether making their way as a cargo hauler, charting the great unknown, or scraping out a living outside the law, players will navigate through a mixture of procedurally generated and handcrafted worlds, interacting with a variety of compelling characters.

Substance

Michel: I first got introduced to Substance at GDC 2014 in San Francisco, where I viewed a booth demo on the expo floor. Although interesting, at that point the software and workflow felt very abstract to me. Later that year, in December, I purchased a Substance Designer indie license and started digging in. Haven’t looked back since.

Sebastian: I don’t remember how and when I discovered Substance Designer, but I’ve been using it ever since Substance Designer 4 was part of the Christmas sale on Steam in 2013.

Michel: We decided to use the Substance toolset as it is incredibly empowering to artists. A lot of what we do in Star Citizen is modular and, to a certain degree, benefits from some level of proceduralism. This makes Substance a great match for our pipeline, as it shares a lot of the same philosophy.

We also quite like the fact that, beyond creating final materials, it also allows more tech-savvy team members to use it more as a tech art tool and build custom nodes that help in our pipeline and simplify steps in the creation process.

We are constantly improving and refining our pipeline and tech; the non-destructive nature of Substance allows us to make changes or adjustments to existing content without too much effort. This way we can have a lot of content already live and playable for our players, while retaining the flexibility to change or improve it along the way.

Our Workflow

Michel: A lot of our pipeline still uses a mix of the more traditional tools, like 3ds Max for modeling and high-res sculpting in ZBrush, but we are also very keen on newer and smarter tools, like texturing in Substance or using SpeedTree to help with the creation of vegetation assets. We are also working on automating elements of our pipeline by using Houdini, potentially offloading a chunk of the modeling work for geology assets to Houdini tools.

On the Organics side of the team, our work is split into a few main categories: vegetation, geology, and the planet itself, which covers global views all the way down to terrain and ground layers.

Pipeline Examples

Vegetation:

For our vegetation, we use a combination of SpeedTree, ZBrush, and Substance Painter. We start by breaking a vegetation asset down into the leaves/flowers, other alpha components, and the branch/bark components. We then build a white-box version of the alpha/leaves texture, using either photo references or mockup shapes that represent the leaf shapes, and use this as a starting point in SpeedTree to build the asset. This way we can validate whether the asset is going to hit our visual goals: the right scale, shape, leaf density, etc. Just like everything in Substance, this is a non-destructive, node-based workflow.

When we are satisfied with the direction of the asset and the layout of the alpha/leaves sheet, we move our mockup layout into ZBrush and start replacing the leaf mockup shapes with actual 3D modeled/sculpted leaves/flowers and other components. After this sculpting phase, we move the sheet into Substance Painter to bake and texture it. We felt Substance Painter was a great match for these assets as it allows us to use a lot of easy generators and get nice base layers and details with just a few clicks — and still have the option to paint in unique details by hand.

For our bark, we use tiling materials made in Substance Designer, together with our updated leaves sheet. The final version of the material is set up and the model exported and brought into our engine.

We follow this process for our entire library.

Terrain materials:

We build our terrain materials in sets, covering a range of textures describing variations that fit together; for example, desert sand, going from smooth to wavy, to covered with rocks, to densely covered by rocks. The set will share a lot of common elements, but we use the flexibility of Substance to easily create variations.

An example of this is the basic_sand node we have in our library. This simple node holds a variety of sand types built in Substance using scan data as a reference point for our PBR values. Each of the presets has been validated in our engine and has all the outputs needed to be plugged into other graphs.

We have various preset nodes like these and often use a mix of different settings to create a nice blend of sand and surface types. This speeds up the creation process, eliminates a lot of the guess and tweak work, and maintains consistency between material sets. By tweaking the presets of these nodes throughout the graph, it has become quite easy to generate variations without rebuilding the graphs themselves.

Below are some examples of ground material sets we created with this method.

Terrains

We use Substance Designer to create our terrain height maps. These height maps are 2k resolution and describe about 4 by 4 kilometers. The biggest difference from regular terrain maps is that ours tile. We cover our planets with these tiling terrains and blend between them to create great-looking, massive landscapes across the surface.

We use Substance Designer to create our height maps, normal maps, color maps, and any additional data we use for scattering objects on terrain.

This makes Substance uniquely suited to our needs compared to other node-based tools specifically focused on terrain. Tiling is one of the features that comes straight out of the box with Substance. Substance is also versatile and powerful enough to let us create our own terrain simulation nodes, something Sebastian will dive into deeper a bit further down.

Planet Global views

For our planets’ global textures we mostly use Substance Painter. We’ve built up a little library of terrain-based alphas to help with the painting process. We often start working from a concept painting, and we can easily translate the details onto our actual model using Substance Painter. This global material then gets applied in our engine and blended with the materials and height maps you’d see when you start moving closer to the surface.

Custom preview models

Another really cool thing you can do in Substance is add your own 3D models. We used this option to add some custom preview models for our pipeline.

We build custom preview models for ground materials and geology. Since we create our ground materials to cover 4m, we have a 4m patch of ground with a scale model from our game. This gives us a constant scale reference in the preview window and makes sure we are not adding elements that are too big, or focusing on details that would add nothing but visual noise.

We use a similar idea for geology materials, but on a vertical plane covering 2m. It’s just useful to keep an eye on the scale when creating convincing materials.

Custom nodes

Over the last two years, we have built up a nice library of custom nodes, tools, and utilities that make our lives easier. Even tiny repetitive tasks or often-used combinations of nodes are worth putting into their own node and sharing among the team. Over time it really speeds things up.

One example would be a node that will take an atlas texture with a grid of up to 25 slots and split them out into individual components to be used in other nodes like tile samplers.

This way we can manage and maintain atlas sheets of specific themes and assets in single files. Sharing and keeping track of these has become much easier. We have one location where all the artists store their atlases; swapping out a theme or asset type is now as simple as swapping out the atlas.
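The job that atlas-splitter node does can be sketched outside Substance as well. Below is a minimal Python/NumPy illustration (the function name and grid layout are assumptions, not the actual node): it cuts a square atlas arranged as a 5×5 grid, giving up to 25 slots, into individual cells, row by row.

```python
import numpy as np

def split_atlas(atlas: np.ndarray, grid: int = 5) -> list:
    """Hypothetical sketch of the atlas-splitter idea: cut a texture laid
    out as a grid x grid arrangement (up to 25 slots for grid=5) into
    individual cells, row by row, for use in nodes like tile samplers."""
    h, w = atlas.shape[:2]
    ch, cw = h // grid, w // grid
    return [
        atlas[r * ch:(r + 1) * ch, c * cw:(c + 1) * cw]
        for r in range(grid)
        for c in range(grid)
    ]
```

Each cell keeps its position in the reading order, so slot 0 is the top-left asset and slot 24 the bottom-right one, which makes swapping whole themed atlases straightforward.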

Another example of a useful utility node is our height_validate node. We balance all our height maps for terrain displacement and parallax occlusion mapping around the 0.5 midpoint. Keeping the same midpoint or ground level for all these maps ensures we always have a nice blend and transition between materials and terrains. The height_validate node gives us a debug view that helps artists balance the height map’s maximum range and midpoint.
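The kind of check height_validate performs can be sketched in plain Python/NumPy (this is an illustrative stand-in, not the actual node; the function name and tolerance are assumptions): measure where the ground level of a height map sits and whether it stays close to the shared 0.5 midpoint.

```python
import numpy as np

def validate_height(height: np.ndarray, midpoint: float = 0.5,
                    tolerance: float = 0.02) -> dict:
    """Hypothetical sketch of a height_validate-style check: height maps
    should sit around a 0.5 midpoint (ground level) so displaced materials
    and terrains blend cleanly. Returns the measured ground level and range
    so an artist can rebalance the map if needed."""
    ground = float(np.median(height))            # dominant ground level
    lo, hi = float(height.min()), float(height.max())
    ok = abs(ground - midpoint) <= tolerance
    return {"ground": ground, "min": lo, "max": hi, "ok": ok}
```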

Technical and Artistic Goals

Michel: The experience we are building is one that allows players to seamlessly enjoy exploring a universe, from the surfaces of different planets, flying over vast landscapes and up into orbit, and witnessing the spectacle of seeing the planet in its entirety standing out amongst the stars.

We want to make sure the experience is visually pleasing and feels detailed enough at all these ranges. This means we have to verify our work at multiple detail levels: making sure the terrain, models, and textures hit the quality bar expected from a modern first-person game, with plenty of polygons and texture resolution, all the way up to the large-scale terrain shapes and colors of the landscape seen from a ship flying over it.

Sebastian: Our terrain needs to be interesting but also physically plausible, to avoid evoking a feeling of uncanniness in our players. From a technological point of view, the biggest hurdle is that the data needs to be tileable. At the time, no software capable of simulating geological processes gave us results good enough to fulfill both requirements.

Substance was already part of our terrain pipeline thanks to its procedural workflow, which allows us to create varied and interesting base shapes for our terrain and adjust them with very short iteration times. It only felt logical to investigate whether we could somehow get the missing crucial geological processes in there as well.

Technical Breakdown

There are decades of research available online dealing with all aspects of the simulation of geological processes. The main challenge for us was how to run these algorithms for hundreds of iterations within Substance Designer without crashing the software.

We came up with a graph structure that allows us to do just that. As an example, we will set up Conway’s Game of Life, as the algorithm is very straightforward and has clearly visible results for each iteration, which unfortunately can’t be said for most erosion simulations.

Our goal is a graph structure that nests graphs with higher and higher iteration counts like this:

The 1it graph is the smallest building block and contains a single iteration of our algorithm. We want it to contain just the inputs, the algorithm itself, and its resulting outputs. In our example, the iteration is a straightforward adaptation of the ruleset for Conway’s Game of Life (which you can see on Wikipedia) within a pixel processor. We offset the neighboring pixels to compare them with the “current” pixel, apply the rules, and adjust the output accordingly.
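The per-pixel logic of that single iteration can be sketched in Python with NumPy (outside Substance, as an illustration of what the pixel processor computes): sample the eight offset neighbors, count the live ones, and apply the ruleset. `np.roll` wraps around the edges, which conveniently matches a tiling texture.

```python
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """One iteration of Conway's Game of Life on a wrapping (tiling) grid,
    mirroring what the pixel processor does per pixel: offset the eight
    neighbors, count the live ones, and apply the rules."""
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A live cell survives with 2 or 3 neighbors; a dead cell is born with 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(grid.dtype)
```

Feeding a horizontal "blinker" (three live cells in a row) through one step turns it vertical, exactly as the classic oscillator should behave.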

We limit our use of regular parameters to the iteration count only, since propagating parameters across thousands of nodes via the regular parameters approach gives Substance Designer a hard time.

The 10it graph is where we begin the nesting process. We use blend nodes to check whether an iteration node should be part of the calculation by comparing its iteration number to the maximum iteration count.

The blend nodes are set to “switch,” which helps us save performance by not doing calculations beyond the desired iteration count set by the user. In the blend node we use a custom function for the “Opacity.”

The amount we subtract increases with each blend: for the first one we subtract nothing, for the second we subtract 1, for the third we subtract 2, and so on. This means the blend compares the node’s iteration number with the total number of iterations and, if the condition is true, allows the node to contribute to the final result.
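The on/off behavior of that chain of "switch" blends can be sketched in Python (an illustrative model, not the actual Substance function; the names are assumptions): each blend slot subtracts its index from the requested iteration count and only contributes while the result is positive.

```python
def blend_opacity(total_iterations: int, slot_index: int) -> int:
    """Hypothetical sketch of the per-blend 'Opacity' function: slot_index
    is how much that blend subtracts (0 for the first, 1 for the second...).
    With the blend set to 'switch', opacity is effectively on/off: the slot
    contributes only while the requested count exceeds its index."""
    return 1 if total_iterations - slot_index > 0 else 0

def run_chain(state, step, total_iterations: int, num_slots: int = 10):
    """Evaluate a 10-slot chain: each active slot applies one more iteration;
    inactive slots pass the previous result through unchanged."""
    for slot in range(num_slots):
        if blend_opacity(total_iterations, slot):
            state = step(state)
    return state
```

Asking the chain for 7 iterations runs exactly 7 of the 10 slots; anything beyond 10 saturates, which is why the next level of nesting groups ten of these chains together.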

For the 100it graph we use 10it nodes instead of 1it nodes, but the rest is a very similar setup with slightly different logic.

We check which group of 10s the blend is responsible for and let it pass data through accordingly.

Since the 10it nodes do have an “Iterations” parameter, we do the following in their custom functions to determine their group of 10s.
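The group-of-10s logic can be sketched like this in Python (an illustrative model of the custom function, with assumed names): each 10it node owns one band of ten iterations, so earlier nodes saturate at 10, the node covering the current band gets the remainder, and later nodes get 0.

```python
def slot_iterations(total_iterations: int, group_index: int,
                    group_size: int = 10) -> int:
    """Hypothetical sketch: how many iterations the group_index-th 10it node
    should run when the user requests total_iterations. Each group handles
    its own band of ten; earlier groups saturate, later ones stay at zero."""
    return max(0, min(group_size, total_iterations - group_index * group_size))
```

For a request of 37 iterations, groups 0 through 2 run 10 each, group 3 runs the remaining 7, and every later group is skipped, so the bands always sum to the requested total.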

For all further groups of iterations, we replicate the same setup, but for groups of 100s, 1000s, and so on.

For our example to work, all that is left is to make sure our input is 1-bit (black or white); then we can scrub through the iteration counter and watch life emerge.

This is the result from the first 100 iterations:

Using this setup we can implement a lot of previously impossible algorithms, allowing us to expand Substance Designer’s functionality.

For most algorithms you will want to expose additional parameters; to get those where they need to be, we simply use additional inputs and feed them to the algorithm that way.

If we need to modify and transfer data between each iteration of an algorithm, we use as many “data” nodes as necessary. Note that each will require an additional chain of blend nodes set up the same as above.

Describing the implementation of the research papers our nodes are based on in sufficient detail would turn this into a very long interview.

Water flow over terrain based on Navier-Stokes shallow-water equation:

Erosion over custom noise with water flow used for coloring.

All of these nodes use the graph setup from above, which allows us to run as many iterations as necessary.

We also made some more utility nodes to help with specific effects.

A logistic-function-based terracing node with a mask to control steepness:
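The idea behind logistic terracing can be sketched in Python (a minimal stand-in for the node, assuming a 0..1 height range; the function name, step count, and renormalization are assumptions): split the height range into bands and ease each band toward its plateau with a logistic (sigmoid) curve. A steepness mask would simply vary the steepness parameter per pixel.

```python
import math

def terrace(height: float, steps: int = 8, steepness: float = 10.0) -> float:
    """Hypothetical sketch of logistic-function terracing: the 0..1 height
    is split into `steps` bands, and a logistic curve eases each band toward
    its plateau. Higher steepness gives flatter terraces and sharper cliffs."""
    scaled = height * steps
    band = math.floor(scaled)
    t = scaled - band                       # position within the band, 0..1
    # Logistic ease centred on the middle of the band.
    s = 1.0 / (1.0 + math.exp(-steepness * (t - 0.5)))
    # Renormalise so the eased value still spans the band at t=0 and t=1.
    s0 = 1.0 / (1.0 + math.exp(steepness * 0.5))
    s1 = 1.0 / (1.0 + math.exp(-steepness * 0.5))
    t_eased = (s - s0) / (s1 - s0)
    return (band + t_eased) / steps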

Our favorite features in Substance

Michel: It’s hard to pick a favorite, but I’d say my favorite thing would be the non-destructive and flexible nature of the software. It allows for experimentation, artistic expression, and the creation of tools/utilities at the same time. That is an incredible level of power and control.

Sebastian: Since their introduction, 32-bit float values and the pixel processor have to be my favorites.

Tips and Tricks

Michel: If you find yourself often rebuilding a certain combination of nodes, to achieve a specific effect or make a specific type of mask/selection, consider putting a little bit of extra effort into compiling them in a standalone node that you can add to your library and easily drag into whatever graph you are working on. Building up a little library of personal favorites can quickly become a timesaver in your daily workflow.

When working in a team, make sure you agree on a shared location for custom nodes early on and make it easy for everyone to regularly update that folder. This way you avoid running into issues while opening nodes that have a lot of references to other nodes that are hidden in various areas, like personal drives.

Having to find these missing nodes and fix broken references can become a frustrating experience and cost a lot of time in a project.

Anything you would like to add?

We’d like to thank the Substance team for the opportunity to talk about and share the work we’ve been doing on our game!

We are always looking for talent to join our teams across our various studios. If you are not shy about thinking outside the box and are interested in pushing art, technology, and pipelines to a new level, then have a look at the open positions on our webpage or consider sending in an open application. We would love to hear from you if you’d like to be part of an ambitious project that is trying to break new ground in the gaming space.

Thanks for having us!