They also completed Tech Animation work, such as implementing multiple animations into Mannequin for the Cinematics Team, improving the Playblast Tool to speed up the creation of playblasts for review purposes, and improving the binder process that maps animations from MotionBuilder onto our Maya rig.

AI

NPC

The AI Team worked on adjusting the AI components to allow safer construction across different threads – a fundamental step towards fully achieving Object Container Streaming. The Actor code has always been very dependent on Lua (it's not easy to make Lua thread-safe with good performance), so all the AI components are now being moved to be either fully C++ or Dataforge components.

They also worked on a few core functionalities for Subsumption. Subsumption Missions can now define Event Callbacks: missions can receive and send Subsumption events, and logic can be written to execute in association with specific events as described by designers. This functionality is part of the overall effort to support designers in creating more modular missions and to enforce correct communication between modules, preserving thread safety and avoiding 'spaghetti'-like logic. They also extended the functionality to support multiple Mission Objectives for each Mission module.

They continued work on improving the way 'usables' are defined and executed: designers can now create behavior logic associated with the different use channels of each usable type. For example, a usable bed might expose the following use channels: 'Sleep', 'Rest', 'WatchTV', and 'SitOnBed'. When an NPC uses a use channel, it effectively runs logic written by the designers in a similar way to Subsumption functions. This allows a more modular definition of the actions permitted when interacting with a usable, while maintaining the context of the behavior that is currently running.