My name is Mathieu Muller and I am a Field Engineer in EMEA and a film specialist at Unity. Last November, during Unite Los Angeles, I organized a panel called “Unity for Film”, presenting various uses of Unity in the cinematic industry (The Gift by Marza Animation Planet, Disney’s Jungle Book previz, the Adam short by Unity, the short Sonder by Soba Productions, on-set video playback for the movie Passengers, and the animated series Mr. Carton by Tant Mieux Productions) together with some of the people who worked on these projects.

This week, Mr. Carton, the first #MadeWithUnity cartoon series produced for television, launches on the French national TV website.

Today, my guest is Michaël Bolufer, creator, designer, co-director, co-writer, main lighting artist, and fireman animator of the 13 two-minute episodes of the series, here to talk about this production.

– What is the story of Mr. Carton?

Mr. Carton is a clumsy driver who wants to reach a lighthouse at the top of a mountain. That’s the plot! But every single vehicle, every little rock, could be a danger.

Above all, the greatest danger is himself: he has poor driving skills, as if he were not supposed to be there…

– Michaël, are you from the game industry or the movie industry?

I started 15 years ago at a video game company (Etranges Libellules in Lyon) as a graphics artist, then game/level designer, and finally cinematic designer. After 4 years in the game industry I wanted to go further, and moved to the film industry as a CG artist and art director. Coming from games helped me a lot in CG, in terms of being efficient and knowing the machine's constraints. Now I play ping-pong between the two industries, working as a contractor or on my own productions of both games and animated series.

– What game engines did you use?

I used an in-house engine while working at Etranges Libellules, which was really great for what we were doing, especially its camera system. I then used Epic's UDK for six months, building a fighting system for a game. I discovered Unity 4 in 2013 at Artefacts Studio in Lyon, working as a contractor on a racing game and making side projects. That is when I started to test my ideas about real-time animation.

– How did you come up with the concept? Did you want to use a real-time engine and then create Mr. Carton, or the other way around?

In 2008, I won a prize at the Annecy animation festival with the script of a short film about the game industry, one that would have been made with a game engine. The aesthetic of Feist was the revelation that a game engine could visually generate something so singular and attractive that it could be used for an animated film. That short never happened, but the idea of making a film with a real-time engine was planted forever. The worm was in the apple, as we say in French…

In 2012, I was making a lot of commercials, which was lucrative but boring, and I had a sheet of paper under my keyboard accumulating ideas for a story set in a cardboard world.

Then I made the first teaser of Mr. Carton with 3DSMax and V-Ray. My machine was slow, and the “worm” kept telling me that using Unity might make my life better.

In 2013 it all came together. I was working in parallel on the writing of Mr. Carton, on a racing game made with Unity at Artefacts Studio, and on a small Unity side project for my tween girls. Unity 5 was just around the corner, and the Franco-German TV channel Arte asked Artefacts Studio to produce a title sequence for its children's program. We decided to make it with Unity.

It went very well, and soon I was starting production of Mr. Carton using Unity. We were very lucky to have the full support of our beloved producer (Tant Mieux Prod.) in this adventure. They strongly believed in the potential of Mr. Carton and in the way we wanted to craft his universe with Unity.

– Can you explain the production and the pipeline, and how they differ from a standard CG pipeline?

Fabien Daphy (co-director), Nicolas Le Nevé (our chief storyboarder), and I made the storyboard and animatics in Toon Boom. The storyboard is a key element for communicating with the production and the rest of the team throughout the project. I made the first 3D models, rigs, and animations in 3DSMax; they were reworked, finalised, and cleaned up by our 3D supervisor, Olivier Roos, using the same tools. Everything was exported to Unity via FBX.

Four animators (Raphaël Gauthey, Christophe Devaux, Samuel Chauvin and Fabien Bougas) and Benedicte Peyrusse, the animation supervisor, produced most of the animations over a period of about two months. We started from the principle that all animations would be made in 3DSMax, including retakes. However, we ended up doing more and more of the retakes directly in Unity's Animation window, because it was the best place to adjust things in the context of the shot (camera, lighting, etc.).

We soon had to hire an extra modeler (François Beudin) for a few months, because we ended up reusing fewer models than planned. In particular, modeling the cardboard edges was very time-consuming.

Fabien and I did the greyboxing in 3DSMax and Blender, based on the storyboard, and gave it to the animators and modelers to create what would become the prefabs of each scene. The scenery was one big object containing a bunch of individual objects already in place (mountains, roads, etc.) that we could adjust later in Unity. Trees, rocks, and other details were each an individual FBX and soon became prefabs that we painted directly into the scene in Unity using QuickBrush from the Asset Store.

Fabien and I did all the lighting and compositing, with on average two days per episode for it. We were too lazy and busy to do external compositing, so we did it all in Unity. Color grading was set per episode, while depth of field, bloom, and ambient occlusion were set per shot.
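The interview does not show how these settings were stored, but the split described here (grading fixed per episode, effects varying per shot) can be sketched as per-shot overrides merged over episode-wide defaults. All names and values below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: per-episode defaults with per-shot overrides.
# None of these names or values come from the Mr. Carton project; they
# only illustrate the "grading per episode, effects per shot" split.

EPISODE_DEFAULTS = {
    "color_grading": "warm_lut",   # applied to every shot of the episode
    "bloom_intensity": 0.0,
    "dof_focus_distance": None,    # None means depth of field is disabled
    "ssao_intensity": 0.5,
}

SHOT_OVERRIDES = {
    "ep01_shot03": {"dof_focus_distance": 4.2, "bloom_intensity": 0.8},
    "ep01_shot07": {"ssao_intensity": 1.0},
}

def settings_for_shot(shot_id):
    """Merge the episode-wide defaults with this shot's overrides."""
    merged = dict(EPISODE_DEFAULTS)
    merged.update(SHOT_OVERRIDES.get(shot_id, {}))
    return merged
```

A shot without an entry in the override table simply inherits the episode defaults.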

Where the workflow really differs from a standard offline pipeline (long CG renders, then compositing) is in the sequencing. We used Flux from the Asset Store to do all the sequencing of each episode directly in the Editor. We chose Flux because it is frame-based and allowed precise sequencing of animations at 24 frames per second. Nuno Afonso, the creator of Flux, was very responsive in implementing features and fixing bugs throughout the production.

Each animation was delivered with an extra margin around what the storyboard required, and we assembled animations, models, cameras, lighting, and compositing together in the sequencer. The sequencer is where the big boost of a real-time animation pipeline happens, in both time and creativity.

Finally, we exported each sequence to video (via a custom script exporting PNG sequences) and did the final cut in Adobe Premiere. Each shot in the sequence was a bit longer than planned in the storyboard so that the final cut could be made externally. This also let us cover the precomputed real-time GI “pumping” that we sometimes experienced when jumping between two distant locations in a single frame.
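The interview does not say which tool turned the exported PNG sequences into video; ffmpeg is one common choice for this step. As a hedged sketch, one could build the command line like this (pattern and filenames are hypothetical):

```python
# Hypothetical sketch: assembling an exported PNG sequence into a video.
# The interview does not name the tool used for this step; ffmpeg is one
# common choice. This function only builds the argv list; run it with
# subprocess.run(cmd) once ffmpeg is installed.

def ffmpeg_command(pattern="frame_%04d.png", fps=24, out="episode.mp4"):
    """Build an ffmpeg argv list for a numbered PNG sequence."""
    return [
        "ffmpeg",
        "-framerate", str(fps),  # input frame rate of the PNG sequence
        "-i", pattern,           # numbered input frames
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",   # widely compatible pixel format
        out,
    ]
```

Passing the frame rate explicitly matters here, since the series was sequenced at exactly 24 frames per second.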

Lastly, all the music and sounds (made exclusively by mouth) were performed live on top of the animatic.

– How did you do the lighting?

We used deferred rendering (which allows many real-time lights) and linear color space (for more realistic lighting). For each object of the scenery, I generated a second set of UVs to use with Unity's precomputed real-time GI. I did not use Unity's automatic UV generation, in order to keep strong control over it. We had mainly one light contributing indirect lighting, with the rest contributing direct lighting only. In some shots we used up to 40 lights. On some occasions we used extra indirect lights (e.g. the UFO's blue light bounced, while the red light on the road was direct only).

We used a lot of projectors with cookies (more than 40, all handmade!) to get precise control over the lighting and give it life. In some episodes we used volumetric lighting, as in the image above, for the car's lights and the UFO's laser.

For shadows, I used real-time shadows, mostly on the directional light and sometimes on a few extra lights. I also tweaked the quality settings per shot to get different looks for the shadows. On top of this, I used two kinds of screen-space ambient occlusion, with different radii and intensities, for extra shadowing (SSAO Pro and Unity's free Cinematic Effects pack from the Asset Store). We did not use the latest Unity post-processing stack because we locked our Unity version before it became available.

These are a few examples among many. There is no recipe in lighting, only methods and tricks. The story dictates the lighting, and we used many different tricks in each episode.

– Did you have issues with aliasing?

To get the stop-motion effect, we wanted to shoot at 12 frames per second. Animations could be created at 12 or 24 frames per second, but we only kept even frames when exporting to video at 24 FPS. All this to say that we did not want any kind of motion blur, which ruled out temporal anti-aliasing. So in the end we only used SMAA (Subpixel Morphological Anti-Aliasing). There are some shots where the anti-aliasing could have been improved, but we were generally satisfied with the result and did not have time to work on it further.
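One way to read the "even frames only" trick above: animate at 24 fps, keep every other frame, and hold each kept frame for two output frames, so the video still plays at 24 FPS but the image updates only 12 times per second. A minimal sketch of that idea (the function name is hypothetical, not from the production's custom script):

```python
# Hypothetical sketch of the "even frames only" trick: keep every other
# frame of a 24 fps sequence, then double each kept frame so the output
# still runs at 24 fps but updates only 12 times per second, giving the
# stop-motion feel described in the interview.

def stop_motion_frames(frames):
    """Keep even-indexed frames and hold each for two output frames."""
    kept = frames[::2]           # every other frame -> 12 updates/second
    doubled = []
    for f in kept:
        doubled.extend([f, f])   # each frame occupies two 24 fps slots
    return doubled
```

The output has the same length as the input, so the edit timing in Premiere is unaffected; only the update rate of the picture changes.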

– This cardboard looks great! How did you do it?

We used real photos of cardboard that we reworked and cleaned up in the classic game-art tradition, so that they would tile and render nicely. Across the entire season we used about ten textures for it. We had about six shaders, made in Shader Forge, depending on the complexity of the characters, managing different normal and occlusion maps, plus the pen markings on top.

– Now that the production is finished, would you say it was worth using a real-time engine compared to more “traditional” methods?

We could probably never have finished in this time and on this budget with an offline process. But in fact the question never came up, because we wanted to do it this way anyway, and everything went pretty much as planned. The worm in the apple was right!

– What would be your advice to someone who wants to start their own film with Unity?

Don't try to redo what you can already do offline. Build a strong vision in the technical art direction; take the technical constraints as art choices, and the art choices as constraints. That makes it necessary to learn a bit about how real-time rendering and engines work. And then, have fun with the creative freedom you will get!