Around the Verse: Lighting the Universe

Written Thursday 18th of May 2017 at 03:28pm by CanadianSyrup, Desmarius, StormyWinters and Sunjammer

As per usual, anything said during the show is subject to change by CIG and may not always be accurate at the time of posting. Also, if you spot any mistakes I may have missed, please let me know so I can correct them. Enjoy the show!

TL;DR (Too Long; Didn't Read)

Studio Update Great strides have been made in LA in regards to Item System 2.0

The Origin M50 Interceptor served as the test bed for its implementation and for better tool creation, allowing ship systems and components to be balanced in ways that make multiplayer game-play more rewarding

QA aided by converting existing checklists to the new 2.0 framework

Engineering improved persistence and inventory, allowing queries of modified data for entries that aren't even spawned, and addressed items and ships abandoned while out of game

Progress has been made for transporting ships within ships and in hauling the Ursa Rover without explosions

Quantum drive was also converted allowing it to store nav points and use the pipe system for fuel and power checks, soon to look and sound as good as it performs

LA has also implemented an autopilot system into the Intelligent Flight Control System (IFCS) and cinematic support for thruster animations

The ship team has completed the Aurora ES and LN seat geo and has begun on the engines for Item System 2.0 framework utilization, and the Anvil Terrapin is nearing final gray box and animation

Tech animation developed a stand alone installer of samples to save both themselves and outsourced partners time while upholding CIG standards

SRC (Source) rigging scripts have been created to allow for quick, easy and bug-free rigging and animation updates

The tech art team has now created a new data structure allowing for eye color customization

The LA engineering team has used item port tags to allow consistency between head and body coloration, created a process to simulate engine trails in atmosphere, built new tools increasing efficiency for everything past the first LOD (and thus speeding up ship pipeline delivery time), and moved forward procedurally on outposts and their props

The tech art and character teams have added a fully rigged female medium marine and a male heavy outlaw suit to the armory, and are moving forward on many other character accoutrements, allowing the UI team to establish necessary boundaries when helmet interior work is finished

Behind the Scenes: Lighting and Fog Nathan Dearsley, Chris Campbell, Emre Switzer, Maria Yue and Ben Parry

Bad lighting can make good assets look terrible but good lighting can make bad assets quite nice

You can use lighting and colour it as a way to create a continual storyline from start to finish

CryEngine out of the box was really good at creating outdoor environments but was very cumbersome for good interior lighting

Ultimate challenge: make the lighting scale from room under the stairs to a whole galaxy

The power of lights is kept relative, from decorative table lamp to the sun: this creates challenges for auto-exposure

Lighting has to be dynamic: must support shooting out all the light sources in a room

Halfway there: anamorphic screen space flares are in, realtime cubemap generation in soon, the old sun is out (one sun to rule them all and in the darkness, light them)

Planetary lighting is completely driven by the atmosphere of the planet

Lighting also includes colour grading and post effects, e.g. reducing the saturation when injured

All the lighting in the ships is 100% dynamic and 100% physically correct

Lighting works with Design to ensure light placement and callouts are correct, e.g. highlighting a useable panel

The previous lighting "layering" system had issues: file sizes were huge due to thousands of (largely unused) lights, and lights were simply on or off with no transitions or colour cycling

New Light Groups system: each room has its own power state and can be enabled/disabled; a controller creates the transitions and can e.g. react to damage at a location

Finding the balance for exterior and interior lighting is a challenge

Lighting plays a huge role in conveying the mood of an environment

Lighting makes all the elements in an environment cohesive, guides the player and enhances gameplay

Lighting impacts VFX as particles aren't lit the same way: SC uses direct lighting and cubemaps but requires some tweaking

Each light has three elements: emission power, light feature and an actual light entity

Old fog tech had a number of issues: it didn't react to lights in any way so artists had to approximate it; it over-brightened shadows, making the scene look flat; and it used a very simplistic approach to transparent objects

A new dynamically lit fog system has been integrated from Lumberyard: lights now affect it, and it uses voxels to determine colour and density



Full Transcript

Intro With Sandi Gardiner (VP of Marketing), Forrest Stephan (CG Supervisor). Timestamped Link.

Sandi Gardiner(SG): Hello and welcome to Around the Verse, our weekly look at the development of Star Citizen. I’m Sandi Gardiner.

Forrest Stephan(FS): And I’m Forrest Stephan, for the past week you’ve been intercepting messages about a top secret ship in development. Here’s the latest transmission.

SG: As you probably guessed, the ship in development is the Aegis Eclipse. Starting tomorrow the Eclipse will be available as part of our latest concept sale. You can check the Comm-Link tomorrow to learn all about the Eclipse… its role and its history.

FS: And in today’s show we will see how story and tone can be reinforced through the art of lighting but first let’s go to Los Angeles for their studio update.

Studio Update With Eric Kieron Davis (Senior Producer). Timestamped Link.

Eric Kieron Davis (EKD): Hi and welcome back to Los Angeles. I'm Senior Producer Eric Kieron Davis here with your monthly studio update. This month we've made great strides and finished a variety of tasks across both projects, and the team here in Los Angeles continued to grow which really helped us knock things out quickly. Now in the past we've talked all about Item System 2.0 and its impacts on the myriad of game features. Regarding its impacts specifically on ships it is an improvement to how players can interact with ships and their systems such as how adjustments to item settings can affect game-play. Now our tech design, engineering and QA teams have made steady progress in their various disciplines.

Now, in our endeavor to reach the goal of rolling out a fleet of Item System 2.0 ships with updated or new items that can be loaded onto them, we've now successfully converted the Origin M50 Interceptor to fully utilize this new system. We chose to start with this ship because it's the least complex example while still allowing us to discover issues that we can address for all 49 flyable ships and beyond. It's been the perfect test monkey. No offense, M50 pilots. And you've probably learned from your own experiences that one tends to be a bit more meticulous the first time you attempt something. We do the same thing at Star Citizen by properly documenting all necessary steps, thereby creating guides to speed up future processes. In our first round through we also looked to identify opportunities to create tools further speeding up our overall implementation time, and this attention to detail has really allowed us to balance power usage, heat generation, associated EM and IR signals, and hydrogen and quantum fuel consumption. This will also give players a reason to consider upgrading their ship components and make multiplayer game-play a bit more rewarding.

Now QA aided with this conversion by taking an early look at the ship and determining how to convert all existing checklists to the new 2.0 framework. When making any change to our game QA has to test everything, which in this case included all the different interaction points. Prior to the interaction points, testing was limited to just entering and exiting, but checks were added for ladder enter/exits, entry enter/exits, power on and off, engines on and off, as well as looking ahead to features not yet implemented such as ejections and cases in which more than one player attempts a particular interaction.

Now the engineering team has also made strides in the areas of persistence and inventory. They're currently working on creating a technique for clients to request persistent information. This work supports several large features in 3.0 including cargo, shops, commodities, air traffic control, ships, players and a whole lot more. It will allow game code to query for modified data for entries that aren't even spawned, such as selling cargo for a ship that's landed at a station and hidden away from ATC. These features will also allow game code to correctly respond to and orient ships or items that have been abandoned on planets or in space. Meaning you can expect the world and your possessions to remain in the same state in between game sessions, unless of course a pesky pirate comes along and does what pesky pirates do.

Now we've also made progress on the system which allows one to park their ship inside of another. This should be as straightforward as possible and result in being able to transport any stow-able ship safely from point A to point B. This was based off of a rework of the landing mechanic that is currently in game. Now the new docking areas are set up the exact same way as landing pads used within the current Universe, taking components with a different interface and a new mechanism for locking. There's also been some work on the physics of getting the Ursa Rover to sit inside the cargo bay of the Constellation Andromeda without it popping through the walls and jittering. So, in other words, hopefully physics won't go wild and blow everything up … literally.

Now the team has also converted the basic quantum drive to Item System 2.0, giving it the ability to store quantum travel and other nav points. This means that all discovered quantum travel points are able to be set as travel destinations for use at any time, regardless of distance and signature strength. Now the next goal is to make quantum drive look and sound as awesome as it behaves by connecting the VFX and the audio to the actual transit. This also involves working closely with design on a way to better display them to a player in a logical interface. And then from here we can move on to pure 2.0 systems, as quantum drive now uses the pipe system for fuel and for power checks.

Now also this month we've implemented a few new features into our IFCS or Intelligent Flight Control System. On the physics side we've now implemented an autopilot system to allow our AI and any other system to utilize IFCS for takeoff, landing or quantum drive, or anywhere ship control really needs to be automated. And we've also added some support for cinematics to be able to automate the motion of thrusters on ships, so they don't need to hand animate every thruster action in the cinematic. The thrusters on a ship will now behave as intelligently as they do in our current game.

Now our ship team has been making very steady progress on the RSI Aurora since our last update. The art team has now completed the seat geo for the ES and LN variants and begun work on the engines while tech design is implementing these new assets utilizing the Item System 2.0 framework directly into the ship archetype making this our first scratch built Item System 2.0 ship. Also the Anvil Terrapin's exterior is nearing completion of the gray box phase and has near final animation. Our tech content team continues to improve performance by automating and improving processes.

As you know the scale of Star Citizen is such that even large teams need some additional support in the form of outsourcing partners. One of the difficulties with outsourcing tends to be ensuring a team's refined processes are adhered to, and that assets delivered meet all requirements for simple integration into our game. As you've heard in the past there are many pipelines and processes within Star Citizen, and some of them are more complicated than others. On-boarding an outsourcing team requires that the tools can be installed and run in an external environment with limited support from us, really in order to save time. So this month the tech animation team developed a stand alone installer that automatically mounts sample assets, tools and documentation, no matter if it's for MotionBuilder or for Maya. We can now easily bring on board any potential partners quickly, saving both them and us time, and these same partners benefit from the extensive internal tool development that we did for our own needs.

Tech animation is also responsible for character skeletons, and like all things creating a character skeleton can be done manually or automatically. Typically in games the rig is not really that complex, nor does it change often, thus the manual approach could actually save time, but when you are on the cutting edge of technology updates are often required. For example, an animation engineer may require the addition of a specifically named joint for code purposes, thus requiring changes to all skeletons in the game. This would be very time consuming if done manually, but we've now completed our SRC or Source rigging scripts, and can make these kinds of updates quickly, easily and bug free. The time and energy saved is not only for the rigging team but also for the animation team, who will be utilizing these skeletons day after day. Now a programming analogy would be to think of the rig as a compiled executable. The SRC rigging scripts are the source code. So if we need to add something to the skeleton, we update the source code and compile it rather than patching the executable. You would just rebuild them anew.

Now changing gears a little bit, up until now all of our characters' eyes have been more or less the same, but the tech art team has created a new data structure that will allow players to customize their eye color. This supports the first pass of the Character Creator where players will be able to select from a preset color palette. Now the tech art team has also taken advantage of a feature recently provided by the LA engineering team. Through the magic of item port tags the body skin tone will now automatically adjust to the skin tone of the face. In the case of NPCs this will maintain consistency for our characters, and in the case of players this will ensure your body always matches your face. Also they've created a process to generate SDF or Signed Distance Field volume textures, which are used in conjunction with our atmospheric flight model to simulate engine trails. We've made solid progress on art tools for our various art teams, and one such tool is our Unbevel Tool, which simplifies the LOD or Level of Detail creation process to increase performance on anything beyond our first LOD and speed up the delivery time for our ship pipeline. We've also taken large steps forward on our procedural system for outposts including color tinting, material variation and even variation of props and their placement within the outposts.
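As a rough illustration of the signed distance field idea mentioned above: an SDF stores, for each sample point, the distance to the nearest surface (negative inside the object), so an effect like an engine trail can cheaply query how close it is to geometry. This is a hypothetical sphere SDF for illustration only, not CIG's volume-texture pipeline:

```python
# Illustrative sketch: a signed distance function for a sphere.
# Positive outside the surface, zero on it, negative inside.
import math

def sphere_sdf(point, center, radius):
    """Signed distance from `point` to the surface of a sphere."""
    dx, dy, dz = (p - c for p, c in zip(point, center))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

# A point 3 units from the centre of a unit sphere is 2 units outside...
outside = sphere_sdf((3.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0)
# ...while the centre itself is 1 unit inside (negative distance).
inside = sphere_sdf((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0)
```

In the actual pipeline such values would be baked into a 3D texture around the ship rather than evaluated analytically.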

Lastly our tech art and character team have added more armor to the armory with a fully rigged female medium marine as well as a male heavy outlaw suit that we've shown in previous updates, going from concept all the way into final implementation. We're also far along on many new uniforms, costumes, characters and heads. The male OMC light is wrapping up its initial high poly pass and has moved on to in-game mesh creation. The male Shubin miner uniform has moved to in-game texturing now that the in-game mesh is complete, and a new shipjacker uniform for Squadron 42 just finished up concepting and is on its way to high poly. Our female marine BDU finished up sculpting and is now headed to in-game modeling, and with the FOV slider work in progress for 3.0 the character team is doing a bit of work on our helmet interiors, starting with the heavy outlaw and the heavy marine, which is used by our UI team to establish necessary boundaries.

Well that wraps us up for this month's update. We really enjoy bringing you these in-depth looks into our progress. Thank you so much for your support, and see you again next time.

Back to Studio With Sandi Gardiner (VP of Marketing), Forrest Stephan (CG Supervisor). Timestamped Link.

SG: Thanks Eric, it’s great to see all the detail going into character customization, even down to eye colour.

FS: Yes, we really want to give the players the ability to create unique characters and this is just the start. As Star Citizen grows, so will the possibilities for character customization.

SG: It’s all about building a believable universe and one very important way to do that is through lighting, use of shadow and fog to help set a scene.

FS: Which is why our lighting team is building a tool that can handle a game as large as Star Citizen, take a look, pretty cool.

Behind the Scenes With Nathan Dearsley (Vehicle Art Director), Emre Switzer (Lighting Artist), Christopher Campbell (Lead Lighting Artist), Benjamin Parry (Graphics Programmer), Maria Yue (Lighting Artist). Timestamped Link.

Emre Switzer (ES): My name’s Emre Switzer; I’m a Lighting Artist here at Cloud Imperium Games.

Nathan Dearsley (ND): Hello my name’s Nath; I’m the Vehicle Art Director here at Foundry 42.

Christopher Campbell (CC): I’m Chris Campbell; I’m Lead Lighting Artist at Foundry 42 at Frankfurt.

Maria Yue (MY): Hi, my name is Maria; I’m the Lighting Artist for Star Citizen.

Benjamin Parry (BP): I’m Ben; I’m a Graphics Programmer and, by habit, I’ve become the volumetrics and the lighting kind of guy.

ND: Lighting, in my opinion, is pretty much the most important pass. It goes into an environment or a ship or a planet before it goes out the door.

CC: Bad lighting can make good assets look terrible or good lighting can make bad assets quite nice actually.

ND: You can have substandard assets and light them well. We don’t make substandard assets so we’re pretty lucky with really good quality to start with. But it’s the basis … it’s not just the cherry: it’s the cake and the cherry on top.

CC: Lighting is the character of the scene. It creates this feeling of either gloominess or happiness. You can use lighting and colour it as a way to create a continual storyline from start to finish. So like a story can start warm and happy and then by the end can feel cold and more bluish. And that’s all told through lighting.

ND: We started with essentially CryEngine, which is now Lumberyard, quite a long time ago. And the game and the engine was very focused to delivering a certain type of scenario. So it was out of the box really good at creating outdoor environments. It had a sun/time-of-day system. Interiors: it wasn’t so good. It fell down in a lot of areas - you could certainly get good results with it but it was very cumbersome.

ES: Its lighting systems were mainly built for either large, open levels or “smallish” levels up to about four kilometers. They didn’t really account for really dynamic worlds.

ND: So over the years we’ve tried our best to cater the engine to something that will scale. And scale from your little basement under the stairs to drawing a whole galaxy. So that’s like the ultimate challenge.

CC: The scale of our lighting is really interesting. The sun is a light source all the way down to a small decorative light on a table or something like that. We try and keep the power of those lights in relative terms. So obviously the sun needs to feel hundreds or thousands of times more powerful than a little desk lamp. And that creates interesting questions for how the camera auto-exposure works. So how it feels when you come from a small dark room and walk into a brightly lit exterior, like on the surface of a planet or something that … We need to create that feeling that there’s a real difference of intensity between these lighting sources.

ES: In Star Citizen the idea is that you’ll be able to go up and literally shoot out every single light in an environment and the environment has to be able to react to that. So how do you build that using the existing tech? We couldn’t so we had to retool a lot of things. There’s all these variables now at play and the tech wasn’t there so a lot of the rooms that the Graphics team has been working on has been to allow those things to exist.

ND: We’re halfway there I’d say. We’ve approached many different things in several different ways. So, for example, last year I think some time you saw the anamorphic screen space flares come in - again that’s tied to the lighting system. We’ve got on the horizon realtime cubemap generation. The whole sun system that we had has gone - there is one sun in the galaxy which will light all of the planets.

So there’s the guys in Frankfurt obviously who are developing the planetary tools and the lighting system for all the planetary tools … are in the planetary tools I should say ... so it’s completely driven by the atmosphere of the planet. And for an artist to get his head around all that at times is pretty challenging but it’s good fun.

CC: We do use lighting for creating a change in the physical - like in the player character - as well, not normally in the lights themselves but through things like colour grading and post effects on the camera, which generally fall underneath the lighting umbrella as well. So changing the colour of the screen, like either desaturating it or adding more contrast and stuff like that, that’s part of lighting as well. Like if the player is hurt or injured the colour grading can react in a way that it either desaturates or makes the colour more vivid or something like that.

ND: When it comes to the ships you’re talking really about a different set of challenges. All the lighting in the ships is 100% dynamic and 100% physically correct. Hence we have a physically-based rendering system.

ES: You do want to have some sort of feedback loop with the artist to make sure the lightbulbs are positioned properly, to make sure that even within a small ship like that your eye is looking at the specific things that you want to call out. So if there’s a turret somewhere you want to know that the turret’s there: it’s not hidden away in darkness. So there’s definitely callouts that we notice but it’s also a collaborative effort: going back and talking to the designers and artists and making sure that if there’s a component you interact with on a wall that you know that that component’s there. Maybe it’s flashing if it’s damaged or something.

ND: The challenge with that is you need these tools put in place to make that happen. Now what was happening up until a few weeks ago now: we have a layering system. So you’d essentially group lights into small groups and switch them on and off at different times during the ship’s state - so if it was in an emergency state you’d switch the default state on … off sorry … and switch the emergency state on.

Now that works in theory but it has a lot of problems with it. First problem is your Cry file or your Lumberyard file ends up being obscenely big because we have thousands upon thousands of lights that essentially three quarters of them, most of the time, are switched off.

And the transition between one state and the other is it’s on and then it’s off. So you can walk around the world today and you can go into this room and you can switch these office lights on and they’ll have a distinct style when they switch on: they might flicker, if they are an LED they might come on to a temperature and cycle through a temperature colour - we have temperature charts we use that are in the engine so it’s completely correct and that negates things going wrong colour-wise. So we wanted a system where we could transition from on, in a very creative manner, to different states, whether that’s evacuation or auxiliary or even to off.

ES: We now have the Light Grouping system so that each room has its own power state. So you can go in, you can enable or disable power to a specific room, you can … that room can take damage and now maybe that has to be put into an emergency state.
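The per-room power state ES describes could be sketched roughly as follows. This is a simplified illustration with hypothetical names, not engine code; a real controller would fade or flicker lights over time rather than snapping states:

```python
# Illustrative sketch of a "Light Groups" setup: every light in a room
# shares one power state, and a controller reacts to events like damage.
from enum import Enum

class PowerState(Enum):
    ON = "on"
    OFF = "off"
    EMERGENCY = "emergency"

class LightGroup:
    """All lights in one room share a single power state."""
    def __init__(self, room_name, light_ids):
        self.room_name = room_name
        self.light_ids = list(light_ids)
        self.state = PowerState.ON

    def set_state(self, state):
        self.state = state
        # A real engine would play a transition (flicker, colour cycle)
        # here instead of switching every light instantly.
        return {light: state for light in self.light_ids}

class ShipLightingController:
    """Routes events (power toggles, damage) to the affected rooms."""
    def __init__(self, groups):
        self.groups = {g.room_name: g for g in groups}

    def on_damage(self, room_name):
        return self.groups[room_name].set_state(PowerState.EMERGENCY)

cockpit = LightGroup("cockpit", ["strip_01", "strip_02", "console_glow"])
controller = ShipLightingController([cockpit])
result = controller.on_damage("cockpit")
```

Grouping by room keeps the file small (one state per room instead of thousands of pre-placed alternate lights) and makes transitions a controller concern rather than an asset concern.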

ND: That controller creating these transitions is, for me, something for the artist to control. So when ship A takes damage at location B, everything within that radius of the location starts to use this system, and when you actually see it working it’s really quite powerful. And it goes to show how powerful lighting is, because you can completely change a really ambient, soft-feeling environment into something that feels very, very aggressive extremely quickly. Just through light alone. Nothing else.

CC: The challenge is finding that right balance. If things are out of whack then it can feel like when you leave a small interior and the interior is too brightly lit then all of a sudden the sun feels really underwhelming by comparison. Or vice versa if it’s really bright outside and you walk into a really dimly lit interior then it’s just pitch black and it just doesn’t feel very immersive or helpful for the player if you can’t see where you are going.

ES: I think that there’s a general vibe that every single level tries to achieve. There are some levels that are vibrant and you want to be welcome there, you want … or the goal of the Art Director is to make you feel welcome there. It’s a nice, calm place. And then there’s the other side of things where it’s tense and it’s … something like Grim Hex where stepping in there you might want to watch your back, right? So there’s definitely different moods that the environments want to convey and lighting plays a huge role in that.

It all starts out as a concept. There are some ideas thrown around and then the Design team goes and blocks out the environment and gets an idea of the forms and the shapes, as well as the gameplay and the pathing that the player’s going to take. Then Art goes in and details it using all of our modular sets that don’t necessarily mesh together very well. Then they do decal passes and prop passes to bring it all together. But lighting is really the thing that makes all those elements of the environment cohesive: it blends all of the different assets that we have together, guides the player in the right direction and enhances gameplay, as well as just overall making the composition of the level as good as it can be.

CC: Lighting also heavily affects … heavily impacts visual effects because things like particles aren’t normally directly lit in the same way that basic geometry is lit. They … in our game they receive lighting from direct light sources and also from cubemaps to give them an ambient lighting feel. But that’s not always … it doesn’t always look directly the same as the environment might look. And so there’s a lot of balancing and back and forth between the Visual Effects Artists that they tweak their particles to the same level that the lighting looks and vice versa that we also try and keep that in mind so that we don’t create a situation where nothing can work.

MY: So what starts here is all the interior area is ready. Of course for now it is purely dark. And the role of the Lighting Artist is, once we light the room, to tell the story of the space. But how we light up this thing is basically introduced by the atmosphere from our Art Director. So here is a good example from our Art Director: this is the lighting setup before, and this is what we are trying to achieve.

So according to this … because we have different versions of the light - we have three different versions of the light. The first one is a fake light source which is going to trigger the emission power. So here is a light feature, and what we do is link it to the emission power to turn them on. So for each space, once we have the feature, the light should come from the direction of the feature. So what we did is we have this lighting feature first, which is going to control the emission map, and then we have another actual light which is going to tell this space where the actual light comes from.

And after we set up all this space we try to push different colour tones for character and warmth. And once I’ve toned down the fog I’ll try to get them even closer with the guidance from the Art Director. So, basically, this is how we work.
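The three-part setup Maria describes (a visible light feature whose emission map glows, linked to an actual light entity that casts the illumination) could be sketched like this; the class and field names are hypothetical, for illustration only:

```python
# Illustrative sketch: a placed light pairs an emissive "feature" (the
# visible fixture, driven by an emission map) with an actual light entity,
# so toggling one toggles both and they never get out of sync.

class LightFixture:
    def __init__(self, name, emission_power, light_intensity):
        self.name = name
        self.emission_power = emission_power    # drives the emissive material
        self.light_intensity = light_intensity  # the real light entity
        self.enabled = False

    def toggle(self, on):
        self.enabled = on
        # Linking emission and illumination means the glowing panel and
        # the light it appears to cast always agree.
        return {
            "emission": self.emission_power if on else 0.0,
            "intensity": self.light_intensity if on else 0.0,
        }

lamp = LightFixture("corridor_panel", emission_power=2.5, light_intensity=400.0)
state = lamp.toggle(True)
```

The key design point is that the direction the light appears to come from always matches the fixture the player can see.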

Usually once I’ve done the lighting I’m going to do the character casting, because the character is a very important part of the game. So usually once I’ve done the lighting setup I’ll just use this test feature to see him walking around and make sure that in different positions the light is going to cast on the character correctly, and that you’re going to be able to see the character.

And also we have two different light setups. One is the cold light that you can see casting from each exit of the door: it’s really cold. So in that case I design some warm light to make sure the character always has different cold and warm tones to mix and make it look more interesting.

CC: There are new lighting tools created on a … probably a weekly basis at this point. We’ve just recently integrated a first pass of our lit fog technology. Which is basically a way of transforming the old fog, which … I mean it has depth but feels quite flat in the way that it renders the scene. This new technology gives you a sense of where the light comes from: light sources can actually cast light into the scene from their source.

BP: At the moment the old fog doesn’t react to lights in any way. So what an artist will have done is they’ll have put fog in an area and they’ll have set the colour and the thickness of it to roughly approximate what it would have looked like if it had had lighting on it. So as an example if someone’s put a red light in a room, they’ll probably have put some red fog there to go with. What they’re actually trying to get the impression of is some very thin white fog with a really strong red light on it. So now when you put some really thick red fog in the room and you shine a red light on it it’s going to go completely opaque and it’s going to be incredibly red and it’s going to look terrible.

What it’s actually doing is it’s basically just drawing a large cuboid onto the screen and because it knows how far into the scene the opaque objects are in that scene it can work out how much fog it would have to put on here.

But it has a few problems.

So as a very simple example, you can tell it tends to over-brighten the shadows: it flattens out the effect of the entire scene. And the other problem we’ve actually got is if you add more lights you can see that the scenery lights up but the fog itself is just still this fixed yellow colour that I’ve picked in advance.

Now another issue that it had: this is a transparent sphere and, because it doesn’t have any depth information, it can’t actually apply the fog to this. So the old fog system, on the CPU side, just does a very simplistic approach: it works out how much fog the very middle of the sphere would have and applies that over the entire thing. So if I zoom in on it a bit, and then I lift it up, you can see that it stays fogged even as it pokes out, and then just as its centre crosses out the entire thing leaves the fog.
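The behaviour Ben describes can be sketched with classic exponential fog. The functions and constants below are illustrative only, not the actual renderer: opaque pixels get fog from their true depth, while a transparent object has no per-pixel depth, so the whole object is fogged at its centre depth:

```python
# Illustrative sketch of the old screen-space fog. Fog amount grows
# exponentially with the distance the view ray travels through the fog.
import math

def fog_amount(distance, density):
    """Classic exponential fog factor: 0 = no fog, 1 = fully fogged."""
    return 1.0 - math.exp(-density * distance)

def fog_opaque_pixel(pixel_depth, density):
    # Opaque geometry: each pixel's depth is known, so fog is per-pixel.
    return fog_amount(pixel_depth, density)

def fog_transparent_object(center_depth, density):
    # The simplification: every pixel of a transparent object uses the
    # object's centre depth. This is why a canopy poking out of the fog
    # stays fogged until its centre crosses the boundary.
    return fog_amount(center_depth, density)
```

This makes it clear why thick artist-placed coloured fog plus a coloured light looked so wrong: the fog colour was baked in rather than derived from lighting.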

That was mostly possible to work around, but you often had problems with windows on ships or anything with a large canopy: the whole canopy would suddenly pick up the fog of the inside of the place.

ND: So now we’ve got dynamic fog, dynamic particles, to go with the lighting, and it’s incredibly cool. I’ve got a bit of a reputation for liking the fog and particles a little bit too much. It’s actually the second thing I do as soon as I go into a level: you automatically get depth, you get a certain ambience and a mood via the fog, and it’s just incredibly powerful. It backs up all the hard work the lighting guys put into the levels.

[Room fills with fog]

And that’s intense. That’s very intense.

BP: So with the new fog you can obviously see that the lights are actually affecting it. We’ve got a spotlight here going into it. And what’s quite nice is that if you get down into the soup you can … you can actually very clearly see that there are these shafts of darkness where the shadows properly affect the fog.

So this is tech that we’re integrating from Lumberyard at the moment. It’s still in progress, but if I switch over to the debug modes I can show how it’s working.

So this is just a horizontal slice that we’ve taken through the texture that we use. So what we’ve got here is a volumetric texture that, at the moment, is about a fifth of the screen resolution and about 64 slices deep. And the samples are distributed towards the viewer end - you get more detail up at that end - just because your field of view widens in the distance, so the same number of divisions are spread over many more meters in the distance.
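That non-linear distribution of slices - more resolution close to the camera, coarser further away - is commonly done with an exponential mapping from slice index to depth. A small sketch with assumed near/far values (the real grid is roughly a fifth of screen resolution by 64 slices; the exact mapping CIG uses isn't stated):

```python
def slice_depth(i, num_slices=64, near=0.5, far=1000.0):
    """Map a froxel slice index to a view-space depth, biased toward the
    camera: early slices cover few metres, later slices cover many."""
    t = i / num_slices
    return near * (far / near) ** t
```

With this mapping the first few slices sit within a metre or so of the camera while the last slices each span hundreds of metres, matching the behaviour described above.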

But as you can see this rectangular volume has been inserted into the volumetric texture. It doesn’t bother inserting them here because it knows that there’s an opaque object, so it doesn’t need to know what values go there. So that’s just an optimisation. So that’s just the density and the colour of the volume that’s been inserted there.

So after that we have a second pass that takes all the lights in the scene - again, a compute shader is run for every voxel of this volume. So into a second texture we take all the lights in the scene and we multiply them through with the density and with the opacity of the volume. You can’t really tell here, but it’s also working out, depending on your viewing angle, that a light will scatter toward the camera more - so the highlights will change shape very slightly, maybe not visibly - but also from here you can see that this dark lump is casting the shadow from the main light while still receiving the blue light from the sides.
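The "scatters toward the camera more" behaviour he mentions is usually modelled with a phase function; Henyey-Greenstein is the common choice in volumetric fog implementations, though the transcript doesn't say which one CIG uses. A per-voxel sketch under that assumption:

```python
import math

def hg_phase(cos_theta, g=0.3):
    """Henyey-Greenstein phase function: with g > 0, light scatters
    preferentially forward (toward the camera when cos_theta is near 1)."""
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

def voxel_in_scattering(light_color, intensity, shadow, density, cos_theta, g=0.3):
    """Second pass, per voxel: light, multiplied through with the shadow
    term and the volume's density, weighted by the phase function."""
    p = hg_phase(cos_theta, g)
    return tuple(c * intensity * shadow * density * p for c in light_color)
```

The shadow term is what produces those "shafts of darkness" in the soup: a voxel shadowed from the main light keeps only the contribution of other lights reaching it from the sides.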

So then the next pass after that - we actually do a little bit of blurring after this point, but the next interesting pass - what this is, is actually a ray march that’s been done through the entire volume. So at this point it’s worked out that any object that wants to have fog applied to it can just read a single point in the texture, and it knows that’s exactly how much fog something at that distance would need. So up near the front you can sort of start seeing the fog coming in, but as you get deeper, anything at this point is going to get exactly the same fog drawn over it as at this point, because it’s pretty much opaque by then.
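That ray march is a front-to-back integration: each output slice stores the light scattered in so far and how much transmittance remains, so a later lookup at any depth is a single read. A minimal sketch of the accumulation (illustrative, with made-up per-slice inputs):

```python
import math

def integrate_fog(slice_scattering, slice_extinction, step_len):
    """March front to back through the slices. Each output entry holds
    (accumulated in-scattered light, remaining transmittance) up to that
    depth, so any pixel can fetch its fog with one texture read."""
    accum = []
    in_scatter = 0.0
    transmittance = 1.0
    for scatter, extinction in zip(slice_scattering, slice_extinction):
        # Light scattered in at this slice is attenuated by everything in front.
        in_scatter += scatter * transmittance * step_len
        # Beer-Lambert falloff through this slice's worth of fog.
        transmittance *= math.exp(-extinction * step_len)
        accum.append((in_scatter, transmittance))
    return accum
```

Once transmittance has decayed to nearly zero, every deeper slice stores essentially the same value, which is why everything beyond that depth gets identical fog drawn over it.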

The great thing about that is that whereas with the old transparency you had to work out, on the CPU, how much fog a single object would get in general, with this any pixel that’s being drawn can just read this texture and find out how much fog it should have, so it doesn’t have any of the same problems.

Another quite nice thing actually about this, if we go back to this view: this is now evaluating a noise function and applying it onto the fog, so you can sort of see the patchiness slowly drifting around inside it. And if I turn off the debug you can now see there’s slightly more richness, slightly more complexity drifting around, which lets you look into the scene a little bit more, get more interest, get more variation.
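A drifting noise function modulating the fog density is a standard way to get that patchiness; the specific noise CIG evaluates isn't stated, so this sketch uses simple value noise as a stand-in:

```python
import math

def hash01(i):
    """Cheap deterministic integer hash mapped to [0, 1); a stand-in for a
    noise texture lookup."""
    i = (i * 374761393 + 668265263) & 0xFFFFFFFF
    i = ((i ^ (i >> 13)) * 1274126177) & 0xFFFFFFFF
    return (i ^ (i >> 16)) / 0xFFFFFFFF

def value_noise(x):
    """1D value noise: smoothly interpolate hashed values at integer points."""
    i, f = int(math.floor(x)), x - math.floor(x)
    f = f * f * (3.0 - 2.0 * f)  # smoothstep fade
    return hash01(i) * (1.0 - f) + hash01(i + 1) * f

def patchy_density(base_density, x, time, drift=0.2, amount=0.5):
    """Offset the noise by time so the patchiness slowly drifts around."""
    n = value_noise(x + time * drift)
    return base_density * (1.0 - amount + amount * n)
```

Because the noise only scales the density between fixed bounds, the fog keeps its overall look while gaining the slow-moving variation described above.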

In order to switch over we need to basically pick a date where every old fog volume in the game will break and every new one will start working. So it’s just a case of, once we’ve got the tech in and we’re satisfied that none of the parameters are going to shift around - that suddenly the density value won’t mean twice as much as it did yesterday, or whatever - at that point the environment teams and ship teams have to go through absolutely everything that’s got fog volumes on it and just make sure that they all look good, or delete them if they don’t, or replace them, check that the lights shining onto them don’t show anything that was slightly dodgy about how the lights were set up, all that kind of thing.

CC: It basically replaces the old fog technology completely. It looks better in almost every conceivable way.

BP: We’ve been integrating it from the most recent Lumberyard release that we’ve got. A lot of the work I’m doing at the moment is just taking that and integrating it with the things that we’ve changed in our copy of the engine. So minor things like exactly where you get shadows from for the sun: we’ve changed that to be slightly more efficient, but obviously the new code is coming from a system that hasn’t done that, so we just have to go patch that up, find where those parameters are coming from, find where that data’s coming from, make sure that it all feeds through in the right way, and hunt down bugs that are caused by differences between the two systems.

ES: The fog, especially in space, is going to make a huge difference. The U.K. graphics guys are looking into creating a unified fog system, so even in the asteroid belt, or any other kind of dull and plain-looking cluster of rocks floating around - in space you have a ton of ice particles, a ton of rock particles, all these little dust particles flowing around, and that creates volume. So really one of the focuses going forward is making the spaces feel more alive, like there’s matter there, like there’s stuff that you’re passing through as you’re flying through the asteroid belt, and that’s driven by the fog system. So that fog system is going to be massive.

ND: Having that in the engine is incredibly cool. You can create a sense of depth just with fog alone, and as soon as you introduce lights dynamically reacting to that fog - which is what an artist would spend a long time trying to recreate - it’s an incredibly powerful tool to have, to be able to guide players like we touched on before and to create a sense of depth away from the camera. Sometimes the things that you don’t see in the world, where your mind makes up what it is, are far more powerful than actually seeing that asset, so strong silhouettes and things like that are a very distinct and cool style in my opinion.

MY: The fog is on a different layer, so by default the fog comes in really intense. So it depends on what kind of thing you are working on. I can show this: this is the default, so it comes as a volume, but you can activate the fog scattering with the light. So it depends on the situation of each thing; we have to design where the fog is coming from, or what may cause the fog. Usually the fog effect shows up at the brightest point. See, we have a hotspot around the ground and we’ve got the window, with the lighting travelling through here. So the way I design the fog scattering is along the direction the possible volumetric lighting source comes from. See, here is the window, so that’s why the fog kind of falls in that way.

So this is the actual lighting source and again, if I turn it on, everything comes to life. The entire scene, lighting-wise, is consistent: basically a makeshift light fixture, the real lighting that defines this space, and the fog effect.

ND: It’s brand new, it’s something that came in this week, and we’re at the stage now of scaling it up so it’s going to work in an environment like this. Great reference, thanks guys, but we obviously need to make that work at the scale of a nebula, which is bigger than a solar system.

CC: I’m very excited about the lit fog. It’s something I’ve played with for a few years and it constantly amazes me with how much it improves the atmosphere of an area. It just makes the air feel thicker and you can really feel like you’re in this space. Like every single day I grab a new build and there’s always some new thing, some new value that I can tweak that just makes things a little bit cooler, and it’s really exciting being able to see that kind of stuff.

ND: Wow.

Outro With Sandi Gardiner (VP of Marketing), Forrest Stephan (CG Supervisor). Timestamped Link.

SG: I bet the crew had a lot of fun covering Nate in all that fog.

FS: Does that count as research and development?

SG: I’m sure it does and I’m going to say yes. Before we go I just want to remind subscribers that this month’s issue of Jump Point will be available tomorrow. Subscribers can also fly the Drake Buccaneer as part of the ship of the month and if you’re interested in learning about our subscriber program, check out the link in the description.

FS: That’s all for this episode of AtV. Happy Hour returns tomorrow, Friday, at noon Pacific. The talented Josh Herman will create another creature live on Twitch, so be sure to check that out.

SG: Super cool, let’s see if he can top the flying spider and I also want to thank all the subscribers out there, you’re the reason why we can make shows like this one and Happy Hour.

FS: Of course Star Citizen wouldn't exist without our backers so a big thanks to all of you.

SG: Yes and thanks for watching, we’ll see you…

SG/FS: Around the Verse.

CanadianSyrup Director of Transcripts A polite Canadian who takes pride in making other peoples day brighter. He enjoys waffles with Maplesyrup, making delicious puns and striving for perfection in screaming at the T.V. during hockey games. Desmarius Transcriber When he's not pun-ishing his patients or moderating Twitch streams, he's at Relay pun-ishing their patience as a member of the transcription staff. Otherwise, he can be found under a rock somewhere in deep East Texas listening to the dulcet tones of a banjo and pondering the meaning of life. "If you can't do something smart, do something right." - Sheperd Book StormyWinters Director of Fiction Moonlighting as a writer in her spare time StormyWinters combines her passion for the written word and love of science fiction resulting in innumerable works of fiction. As the Director of Fiction, she works with a fantastic team of writers to bring you amazing stories that transport you to new places week after week. Sunjammer Contributor For whatever reason, this author doesn't have a bio yet. Maybe they're a mystery. Maybe they're an ALIEN. Maybe they're about to get a paddlin' for not filling one out. Who knows, it's a myyyyyyystery!
