Changing the Worlds



The weather's not sure what it wants to do out here in Frankfurt: sunny and hot, overcast and raining, last month had a fair share of it all. The office grew by two people in May, with Janine helping out with production and David joining the weapons art team. We're just about at capacity in the office, so we've officially started an expansion. When we moved into the office space last July, the owners of the building put the adjoining space on hold for us for a full year, one of the perks they offer new tenants. At the time we didn't know if we would have any need for it, but thought it was a nice gesture.

The team has also been doing some excellent work not detailed below on the procedural system… they would rather let you see it in action than spoil all the details now, so stay tuned on that front!

This month we also kicked things into gear for Gamescom; it only makes sense for Frankfurt to help out as much as we can since we live here. Not too much can be said at this point, but we'll definitely be there in August.

Cinematics

This month we had our second performance capture shoot for SQ42 at Imaginarium Studios in Ealing, London, which complemented and completed scenes in the earlier chapters.

Last summer's main shoot gave us approximately 75% of all the material needed for SQ42's story. With this new shoot complete and a final shoot upcoming in July, we will have collected every bit of performance! Highlights of this shoot were scenes with the Idris' quartermaster handing out big guns, as well as an in-universe spectrum show with a furious and, most importantly, hilarious host!

For the shoot, the Cinematic team built some of the needed environments for live-mocap and coordinated efforts to get all the required level segments and characters sorted in time for live-mocap to be effective. (Live-mocap is a process where we can see the motion data live in our engine, on actual in-game characters inside the actual scenes and levels.)

Michael, our Lead Cinematic Designer, has finished a first pass on an important, longer cinematic piece that happens in the middle of the story campaign.

The Cinematic animators have been providing support to the Gameplay team in the UK while they wait for data to come back from the shoot. They're also finishing up sorting the quad-cam video data from last year's p-cap shoot. Once that's done, the animators on the Cinematic team will review the p-cap shots against the quad-cam data and make sure the animations play out as intended.

AI

The Subsumption editor has been updated to version 0.901, and we've started implementing more of the functionality now exposed in the tools for designers. First of all, we created some new Subsumption tasks to give designers the following functionality:

1. Querying the Tactical Point System

2. Moving to a specific cover location

3. Shooting from the NPC's current cover at its attention target

We exposed a way for designers to suggest the next activity or subactivity for a given NPC. We also completed the first pass of support for Action Areas. As we explained in April's monthly report, these are elements in the world that allow designers to mark areas with specific information: a multicrew spaceship, for example, might have an engine room, a hangar, a control room, and so on. Action Areas allow the NPCs to reason about the environment to fulfil their tasks. These game elements are also used to notify an NPC when another NPC is entering or exiting the area, or to re-route specific events intended for the characters present in the area. We also implemented the basic code to handle Subsumption events. Designers can create these with different properties (for example, whether the event is something NPCs must see or hear to perceive it, whether there is a maximum range within which the event can be received, and so on), and Subsumption keeps track of all this data.
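As a rough illustration of the idea, an event descriptor like the one below could carry those perception properties (the names and structures here are simplified stand-ins of our own, not the actual engine code):

```cpp
#include <cmath>
#include <string>

struct Vec3 { float x, y, z; };

// Hypothetical event descriptor: which senses are needed to perceive
// the event, and how far away it can still be received.
struct SubsumptionEvent {
    std::string name;   // e.g. "ExplosionHeard"
    bool  mustBeSeen;   // NPC needs line of sight to perceive it
    bool  mustBeHeard;  // NPC perceives it through audio
    float maxRange;     // metres; <= 0 means unlimited
    Vec3  origin;       // where the event happened
};

// Check only the range property here; sight/hearing checks would go
// through the perception system in real code.
inline bool InRange(const SubsumptionEvent& ev, const Vec3& npcPos) {
    if (ev.maxRange <= 0.0f) return true;
    const float dx = npcPos.x - ev.origin.x;
    const float dy = npcPos.y - ev.origin.y;
    const float dz = npcPos.z - ev.origin.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) <= ev.maxRange;
}
```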

We also completed the first pass on using the Usables/Interactors as navigation links. This unification step was required because Squadron 42 and Star Citizen don't just use the classic navigation links that allow characters to jump, vault, and so on. We also have links that connect the navigation through the use of items (for example, passing from room 1 to room 2 may require the AI to open a door), and these items can also be used by the players, making everything even more complicated. Using the Interactors ensures that an NPC opening a door doesn't require any separate code to synchronize the action with players or other AI characters, making the whole code flow much more consistent.
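A toy sketch of the unification (all names here are illustrative, not the engine's): the interactor owns the door state, and both players and NPCs drive it through the same call, so the nav link needs no AI-specific synchronization.

```cpp
#include <string>

// A door exposed as an Interactor: any user (player or AI) requests the
// same state change, so there is no separate AI door logic to keep in sync.
struct Interactor {
    std::string name;
    bool open;
    void Use() { open = !open; }
};

// A navigation link between two rooms, passable only when its associated
// interactor (the door) is open; null means a plain link (jump, vault, ...).
struct NavLink {
    int roomA, roomB;
    Interactor* door;
    bool Passable() const { return door == nullptr || door->open; }
};

// An NPC traversing the link operates the interactor first if needed.
inline bool TraverseLink(NavLink& link) {
    if (!link.Passable() && link.door) link.door->Use();
    return link.Passable();
}
```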

Regarding behaviors, we are currently embedding specific “mental states” to allow the normal NPC behaviors to cope better with cinematic scenes and in-ship states (piloting a ship, controlling a turret, and so on).

Regarding ship behavior, we are almost done with the first pass on improving the behavior of ships assigned to restrict their movement to specific areas. We currently have missions where spaceships need to guard specific environmental elements, or where ships must stay within specific boundaries to avoid being destroyed.

Until now, boundaries were always treated as a soft restriction on ship behavior, but we are expanding the behavior to correctly handle a hard spatial restriction.
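In its simplest form, a hard restriction means the steering target itself can never point outside the assigned volume. A minimal sketch of that idea, assuming an axis-aligned bounding box (the real system is of course more involved):

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };
struct Box  { Vec3 min, max; };

inline float Clamp(float v, float lo, float hi) {
    return std::max(lo, std::min(v, hi));
}

// Hard spatial restriction: clamp the requested steering target into the
// ship's assigned boundary, so the ship can never be directed outside it.
inline Vec3 RestrictTarget(const Vec3& target, const Box& bounds) {
    return { Clamp(target.x, bounds.min.x, bounds.max.x),
             Clamp(target.y, bounds.min.y, bounds.max.y),
             Clamp(target.z, bounds.min.z, bounds.max.z) };
}
```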

This month we also spent time improving the stability and performance of the live game, so that all of you can continue to enjoy each release of Star Citizen.

Weapons

This month the weapon art team finished two new ship weapons and did extensive work on the weapon material library. On top of that, we are currently modelling a new ballistic submachine gun and putting the final touches on the Scourge Railgun's detailing and texturing. The Railgun is meant to be a visual target for all the new weapons to come, so we've taken some extra time to bring it to a gold standard of quality.

We’ve also been joined by a new starter, David, bringing the weapon art team size up to 4 people now.

Design

Level Designers have been working on finalizing the layout for the Outlaw Base; it's had a first art pass and will come back to them for a mark-up phase before release. This base is also scheduled for a tiered release, so there will be a lot of back and forth between the artists and level designers as areas get released and new ones get added. The guys are also working hard on the Truck Stop base, a nice place for pilots to stop to refuel, grab a bite to eat, and pick up supplies before heading out on big journeys. Another area the level designers are working on is adding landing sites to procedurally generated planets.

The system designers have finalized the Power Distribution system, which, once implemented, should power all our ships from single-seaters to capitals, and even stations. This system will allow players to configure how power travels within their ships: which components get power, how much they get, and where they draw it from. At the same time, it should allow more nefarious players to sabotage said system. The Looting, Inner Thought, and Usable/Interactor systems are also fully designed now and are heading into production. A lot of work went into ensuring that all our usable props and their animation metrics are properly standardized, so we don't have to do the work twice and everything will fit together once it's in game.
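To give a feel for the power-routing idea, here is a minimal sketch (the component names and priority scheme are illustrative assumptions of ours, not the final design): available power is handed out in priority order, so the pilot decides which systems are fed first when supply drops.

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct Component {
    std::string name;
    float requested;  // power the component wants
    int priority;     // lower number = served first
    float granted;    // filled in by the distributor
};

// Hand out the available power in priority order; components further down
// the list only get whatever is left over.
inline void DistributePower(float available, std::vector<Component>& comps) {
    std::stable_sort(comps.begin(), comps.end(),
                     [](const Component& a, const Component& b) {
                         return a.priority < b.priority;
                     });
    for (Component& c : comps) {
        c.granted = std::min(c.requested, available);
        available -= c.granted;
    }
}
```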

On the AI side, we are designing a system that can generate archetypes and loadouts quickly enough to fill an entire galaxy without us having to manually build each individual NPC. Basically, by defining rulesets and tag sets, the system will be able to create said archetypes, matching their gear to where they came from, what their job is, what the dress code is in the area they are in, what rank they are, etc. Besides that, there is also a lot of work going into designing the tools we will need for the brains and logic of our new Subsumption-based AI.
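The tag-matching part of this can be sketched very simply (the gear pool and tag names below are a toy ruleset of our own, not actual game data): each gear item carries tags, and the generator keeps only the gear matching every tag the archetype requires.

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct GearItem {
    std::string name;
    std::vector<std::string> tags;  // e.g. {"outlaw", "pilot"}
};

inline bool HasTag(const GearItem& g, const std::string& tag) {
    return std::find(g.tags.begin(), g.tags.end(), tag) != g.tags.end();
}

// Select every item whose tags cover all tags required by the archetype
// (faction, job, dress code, rank, ...).
inline std::vector<std::string>
BuildLoadout(const std::vector<GearItem>& pool,
             const std::vector<std::string>& required) {
    std::vector<std::string> loadout;
    for (const GearItem& g : pool) {
        bool ok = true;
        for (const std::string& t : required)
            if (!HasTag(g, t)) { ok = false; break; }
        if (ok) loadout.push_back(g.name);
    }
    return loadout;
}
```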

Both system and level designers have been planning out the work for the next year and the upcoming releases to make sure we have a more realistic picture of what can be done with the allotted time.

Engineering

Renderer refactoring: On the Renderer side, we did some housecleaning and optimization. Building on last month's refactoring to increase the object count, we started to simplify the data upload to the GPU. Previously, the CryEngine code was based on reflection, so it could find out what data was needed on the GPU and upload only that data. While this sounds like a solid idea, finding out what was needed was more expensive than a straight data upload, so we began removing those reflection code paths in the time-critical areas. This also improved the readability of the code, as we can now see what the code does rather than the logic to figure out what to do. Related to this change, we also made sure the same data is only uploaded to the GPU once. Previously, a data buffer was uploaded for each object, and the same data was uploaded again if the code decided to use instancing. This is now fixed.
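The "upload the same data only once" fix can be pictured as a small cache keyed by the buffer contents (this is a simplified stand-in of our own; the hashing and "GPU handle" are not the real renderer API): when a second object, or the instancing path, reuses identical per-object data, the upload is skipped.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

using GpuHandle = std::uint32_t;

// Content hash (FNV-1a) used as the cache key.
inline std::uint64_t HashBytes(const std::vector<std::uint8_t>& data) {
    std::uint64_t h = 1469598103934665603ull;
    for (std::uint8_t b : data) { h ^= b; h *= 1099511628211ull; }
    return h;
}

struct UploadCache {
    std::unordered_map<std::uint64_t, GpuHandle> cache;
    GpuHandle nextHandle = 1;
    int uploads = 0;  // counts real uploads, for illustration

    GpuHandle Upload(const std::vector<std::uint8_t>& data) {
        const std::uint64_t key = HashBytes(data);
        auto it = cache.find(key);
        if (it != cache.end()) return it->second;  // already on the GPU
        ++uploads;                                  // pretend-upload here
        return cache[key] = nextHandle++;
    }
};
```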

Data Patcher: On the Data Patcher (the tool that will be responsible for creating the data the engine uses once we switch to incremental patching), we made a little progress by better defining how the data is stored. There isn't much reportable progress here, as much of the work is infrastructure discussion.

Optimizations: To further optimize the streaming code, we added timeslicing support back to it. This way, the cost of updating the distance to objects not visible to the player is paid less frequently.
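Timeslicing here just means updating a fixed-size slice of the object list per frame and cycling through the whole list over several frames. A minimal sketch, with illustrative names of our own:

```cpp
#include <cstddef>
#include <vector>

struct Streamable { float distance = 0.0f; int updates = 0; };

// Each Tick() updates only `slice` objects, round-robin, so the full list
// is covered over several frames instead of every frame.
struct TimeslicedUpdater {
    std::size_t cursor = 0;

    void Tick(std::vector<Streamable>& objs, std::size_t slice) {
        for (std::size_t i = 0; i < slice && !objs.empty(); ++i) {
            Streamable& o = objs[cursor % objs.size()];
            ++o.updates;  // real code would recompute o.distance here
            cursor = (cursor + 1) % objs.size();
        }
    }
};
```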

Tag System comes into the ZoneSystem: Initial support was written for storing tags inside the ZoneSystem. A tag is a small string which we use to give context information to an entity object. For example, if we want to know whether something is a chair, we can tag it as a chair. This way, the AI system can query all objects and find out which are chairs. We already have such systems inside the engine, but they lack spatial information, so they can only answer “is this object a chair?” but not “find me all chairs around me”. This led to some inefficient solutions, as the code had to brute-force fetch many objects and check their type. To overcome this limitation, we are moving tag support into the ZoneSystem, our spatial positioning system. This required some changes and new systems:

Storing and comparing a string is not very efficient for a computer (but very convenient for a human, which is why we need it), so we implemented a trie that lets us efficiently map unique strings to a fixed integer range. (We wanted an integer range instead of a hash, as a range allows some better broad-phase checks and more efficient data storage.)

Since not all data types stored inside the ZoneSystem require tags, we made the whole ZoneSystem more flexible, allowing the client code to specify the properties to store per object type. This also reduced our memory usage in Crusader by 50MB.

We implemented a specialized allocator for the tags so they can be efficiently culled by the low-level ZoneSystem, which is implemented in SIMD, so the tags must follow a certain size and alignment.

And as a last thing, we implemented code to allow filtering tags by a DNF (disjunctive normal form), which is a fixed format that can be used for efficient checking of arbitrary boolean expressions.
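To make the DNF idea concrete, here is a toy sketch (our own simplification, not the engine's data layout): with tags mapped to small integer IDs, an object's tags fit in a bitmask, a clause is a conjunction of required and forbidden tags, and the whole expression passes if any clause matches.

```cpp
#include <cstdint>
#include <vector>

using TagMask = std::uint64_t;

inline TagMask Bit(int tagId) { return TagMask{1} << tagId; }

// One conjunction: all `required` tags set, none of the `forbidden` ones.
struct Clause {
    TagMask required;
    TagMask forbidden;
    bool Matches(TagMask tags) const {
        return (tags & required) == required && (tags & forbidden) == 0;
    }
};

// Disjunctive normal form: an OR over AND-clauses.
inline bool MatchesDNF(TagMask tags, const std::vector<Clause>& dnf) {
    for (const Clause& c : dnf)
        if (c.Matches(tags)) return true;
    return false;
}
```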

Runtime Skel-Extensions: The character customization system in Star Citizen internally uses an engine feature called “skinned attachments”. With skinned attachments it is possible to replace every deformable item on a character (i.e. clothes, shoes, space suits, helmets, etc.) and even entire body parts such as faces, hands, or upper and lower body parts. Each skin-attachment has its own set of joints that are automatically animated and deformed by the base skeleton. It is also possible to use skinned attachments that have more joints, or different joints, than the base skeleton, and it is possible to merge all types of skeletons together, even skeletons from totally different characters. That means you can have a minimalistic base skeleton which can be extended by an arbitrarily complex skinning skeleton. In the original CryEngine this was an offline or loading-time feature, because the entire process was pretty CPU intensive. For Star Citizen we turned it into a runtime feature that allows us to extend a base skeleton at any time while the game is running, no matter if the character is alive and playing animations or in a driven or floppy ragdoll state. This means that you don't have to know in advance the type of joints you might need in the base skeleton, nor do you have to carry extra joints around just in case you might need them. Instead, the system allows you to add new joints at will, whenever they are required.
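The merging step can be pictured with a toy skeleton model (names and data layout are our own illustration, far simpler than the real deformation system): joints from the attachment skeleton that the base doesn't have yet are appended, with their parents remapped into the merged joint list.

```cpp
#include <string>
#include <vector>

struct Joint { std::string name; int parent; };  // parent index, -1 for root

struct Skeleton {
    std::vector<Joint> joints;

    int Find(const std::string& name) const {
        for (int i = 0; i < static_cast<int>(joints.size()); ++i)
            if (joints[i].name == name) return i;
        return -1;
    }

    // Merge every joint of `ext` that is missing from this skeleton,
    // remapping parents. Assumes parents precede children in ext.joints.
    void Extend(const Skeleton& ext) {
        for (const Joint& j : ext.joints) {
            if (Find(j.name) >= 0) continue;  // already present
            int parent = -1;
            if (j.parent >= 0)
                parent = Find(ext.joints[j.parent].name);
            joints.push_back({j.name, parent});
        }
    }
};
```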

Full-Body Experience: We also invested a lot of time in improving the full-body experience in first person. Our main goal was to make the head-bobbing customizable. In Star Citizen, head-bobbing is a natural side-effect of the mocap data, because third- and first-person use the same animation. To make the controls as smooth and precise as possible, we implemented a new IK solution to eliminate all unwanted effects of the third-person body animations on the first-person view and weapon handling.
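Customizable head-bobbing boils down to blending between the raw animated head position and a stabilized reference. A tiny sketch of that blend, with names of our own (the actual IK solution is of course far more involved):

```cpp
struct Vec3 { float x, y, z; };

// Blend the camera between the raw animated head (stabilization = 0,
// full mocap bob) and a stabilized reference (stabilization = 1).
inline Vec3 StabilizedCamera(const Vec3& animatedHead,
                             const Vec3& referenceHead,
                             float stabilization) {
    return { animatedHead.x + (referenceHead.x - animatedHead.x) * stabilization,
             animatedHead.y + (referenceHead.y - animatedHead.y) * stabilization,
             animatedHead.z + (referenceHead.z - animatedHead.z) * stabilization };
}
```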

Tech Art

Frankfurt Tech Art is always busy with R&D and actively supporting any discipline that needs help. This month we were helping with eye stabilization for FPS, which stabilizes camera movement while playing in first person; the results so far have been really good. We're also supporting feature R&D for the itemport IK system for the game, which is progressing well too.

On the DCC (Maya) pipeline front, things are becoming much more stable, but it still needs a bit more support as we keep updating and expanding it. We were also fixing bugs and supporting weapon assets on all fronts: rigs, updating rendermeshes, entity setup, Mannequin setup, etc.

Overall, Tech Art is contributing to new features while continuing to support daily operations.

VFX

Over the past month, the Frankfurt VFX team has been working on Squadron 42 single-player missions. This covers almost all aspects of VFX, such as ambient/environment effects, scripted action effects, and even some cinematic effects. It also requires a fair deal of collaboration with the individual level designers and artists: once they are done designing and building a section of a level with props, VFX can move in and decorate it with the appropriate effects, everything from fire and explosions to blood and water.

Quality Assurance

This month in QA we've been working closely with our in-house Engineering team to test progress on the procedural planets, as seen in the Pupil to Planet trailer. With such a large-scale planet, you can imagine the number of issues we encountered throughout our testing, including, but not limited to, the buggy insisting on driving on its z-axis, defying all laws of gravity!

We also spent some time checking all characters currently in Star Citizen and Squadron 42 for issues that might be blocking our cinematic developers. As a result, we were able to identify multiple Squadron 42 characters that had definitely seen better days. Sean Tracy, Ali Brown, and Okka Kyaw jumped on board to assist, and we were able to resolve the issues quickly so that our cinematic developers can continue to create amazing cinematic sequences for Star Citizen.

Lastly, we spent the remainder of the month assisting Chris Raine and Chris Bolte with gathering profiling and concurrency data from the PTU servers. The community was a great help, rallying together to load as many players onto the PTU servers as possible so that we could collect data from a high-load server and help resolve the framerate drops the public has been experiencing in Crusader. May has by far been our busiest month in QA, but we would not have expected any less as our universe grows larger each day.