Editing Deadpool and Hail, Caesar! in Premiere Pro

As an editor who cuts in Adobe Premiere Pro every day, it’s always exciting to see major Hollywood films make the jump to the same software. The Sundance Film Festival hosted an excellent, tip-packed panel discussion on two of these films, the Coen brothers’ Hail, Caesar! and Deadpool. The panel included Deadpool director Tim Miller, editor and Premiere guru Vashi Nedomansky, post supervisor and associate producer Catherine Farrell and additional editor Katie McQuerrey.

In fact, there were 51 feature films at Sundance this year that were cut on Premiere Pro, which is great to hear, as Avid’s position as the ‘must-use’ editing software is slowly being rebalanced after decades of a secure place in the hearts and minds of feature film, TV and documentary editors the world over.

If you want to wade through an epically long post on the making of David Fincher’s Gone Girl, also cut in Premiere, check out this previous post.

In this official Adobe video with Deadpool director Tim Miller you can hear about his experience of working with Adobe and in Premiere Pro and After Effects. Before getting into post, Tim asked David Fincher which NLE he would recommend; Fincher, having just finished working on Gone Girl and enjoyed the process, suggested Premiere. Tim concluded:

I want to do that too. So the idea that this new version of Premiere was built from scratch with filmmakers, who are very picky and discerning, made me believe that it was going to be a product that was going to have some legs and was going to be here for a while and I wanted to work in the future, I didn’t want to go with the past and so Premiere seemed like the perfect choice for us.

It’s great to see such high-profile films and high-profile filmmakers working with Premiere Pro, and to see that both character-driven comedy drama and visual-effects-heavy comic book action movies can find a home in the software. Here are the trailers for both films to whet your appetite!

A Quirky Way To Edit – How the Coen Brothers Edit

In watching the Adobe Sundance panel it was really interesting to hear how the Coen brothers edit their films, which, like their writing and directing, they do together. Representing the Hail, Caesar! post team were post supervisor and associate producer Catherine Farrell and additional editor Katie McQuerrey. They described some of the unique ways in which their post team functioned, and how they worked out a film-based workflow whilst cutting in Premiere Pro. The panel was moderated by Adobe’s Mike Kanfer.

KMQ: When we were thinking about moving to Premiere Pro we met with David Fincher’s post team to see how it was going, and we kind of looked at the way they used Premiere and what they were doing, and we were just like NO. We don’t need any of that, we don’t work like that. We need this and this and this. And they looked at us like – Woah?? How do you edit a movie like that?

MK: In terms of keyboard shortcuts and all that?

CF: We don’t use them. (laughter)

KMQ: We actually use a little pedal – that does actually let you use the N snapping tool – that we got on Amazon. We have two mice; we’re both mouse editors. They storyboard everything and they’re known for their particular process, and the fact that it’s all cut after shooting is very rare in filmmaking.

MK: Tell us more about their process.

KMQ: Ethan finds the good takes, rings a bell, and then Joel picks it up and cuts it in. When we were transitioning from film to FCP7, their former post supervisor decided to do it on a film that was going to be a simple movie to cut, without a ton of VFX – Intolerable Cruelty – and we had lots of support. He felt that was a good time to move to an NLE. They tried to mimic exactly how they worked on film, which was that dailies were watched, then Ethan would pull the select takes on a Moviola, those would be handed to Joel on the KEM and spliced together, and then all the manipulation that you have to do in editing would happen. It’s not like Ethan has cut it in his brain and that’s what you see on screen. So we basically had Ethan on one computer – he pulls takes, numbers them in script order (line numbers) – and then they’re assembled into Joel’s computer, and from there, all the craft happens. And when Ethan gets to a certain point in the scene, where you can actually start working, he rings the little bell – that’s from (Barton Fink?) – they’re very… you know, editing is a craft and a job for these guys.

MK: Take us through the film post-production process.

CF: We shot on 35mm – so it has to go through the photochemical, telecine process. FotoKem develops the negative and sends it over to EFILM, who scans it, and the colour grade is applied – it’s kind of baked in. It’s on our editorial QuickTimes, and we basically don’t veer from that until the final DI and the final colour timing. We use that for previews and anything that needs to happen between shooting and finishing the movie. So Katie will get all the dailies from EFILM, she’ll prep the whole project and basically we’re kind of ready to go, the first day of post, when Joel and Ethan get in the cutting room.

Vashi Nedomansky: Tim, any thoughts on film and digital?

Tim Miller: I don’t know why anyone shoots on film, it just seems crazy? (Laughter)

MK: Deakins had been shooting on digital, so Adobe didn’t expect to have to deal with film, and we had to find a way to handle keycodes (numbers on the film negative, a bit like timecode, that reference each frame for the negative cut). Even simple things like dailies: EFILM is used to putting out an Avid Log Exchange file, or ALE – all the metadata of the dailies, in a text file container that the Avid would typically read. We don’t work with that type of metadata, we do it another way. So we had to find ways to get EFILM to create a Premiere project that had all the dailies in it that went to Katie.

KMQ on Dynamic link

We did a tremendous amount of temp comps in Premiere. The dynamic link to After Effects got a little bogged down sometimes and it wasn’t 100% seamless, but we found the internal effects in Premiere were really good, really easy to use, and we also did a lot of split-screens and manipulation with time remaps, which were all done in Premiere.

KMQ on using the Media Browser

In Premiere you can only have one project open at a time, so in order to access different versions of cuts and different visual effects you have to use the Media Browser. Our cutting rooms are actually split up: I’m working down in Joel and Ethan’s office, which is a pretty small space, and then we have another cutting room about 4 blocks north of us, so we have to do a lot of communicating back and forth. And the Media Browser is this tool to open and look in other projects. It’s also our way of working, in a sense, because it was developed also to help Joel look at everything that Ethan’s doing. So Ethan can be working in a project, and we can open up his project in real time on a different computer and see what he’s doing, as he’s doing it – after he’s rung the bell to tell us he’s finished doing it! Being able to open it up through the Media Browser was crucial to the editing process.

In this recent Adobe video you can hear from the whole post team on the Coens’ process of working in Adobe Premiere Pro. It’s a fun little video.

In this 8-minute video from Adobe Create, you can listen in on some thoughts on how filmmaking and editing have evolved in the 30 years since the Coens started making films, in what are described as ‘outtakes’ from their main interview. This presumably is what went into the extended version (below) of the YouTube promo for the editing of the film, above.

This first video is a fun little trip down memory lane and an interesting window into the Coens’ minds!

UPDATE – Interview with Hail, Caesar! VFX Supervisor

The Verge has a nice interview with VFX Supervisor Dan Schrecker on how they worked hard to make some of the film’s visual effects look as dated as possible. It’s also worth scrubbing over the images in the post to check out the before and after comparisons.

What we did was a full digital solution on the sub. Full CG sub, full CG water simulation — but with enough little details to make it look like a miniature. So the scale of the water and the depth of field stuff we tricked out to give the impression that this is how they might have done it back then. And again, we’re totally in this weird gray area, where it kind of looks like the reality of the film, and it kind of looks like the fake movies they were making.

UPDATE: Every Frame a Painting

In the latest Every Frame a Painting you get a brilliant dissection of the shot-reverse-shot technique as utilised by the brothers Coen.

UPDATE – Additional Editor Interview

Oliver Peters interviews Katie McQuerrey, credited as an additional editor on numerous Coen films, about her role in helping the team transition to Adobe Premiere Pro for Hail, Caesar!, over on CreativePlanetNetwork.com. As with all of Oliver’s writing – well worth a read!

“We were concerned about tracking keycode to turn over a cut list at the end of the job. Adobe even wrote us a build that included a metadata column for keycode. EFILM tracks their transfers internally, so their software would reference timecode back to the keycode in order to pull selects for the final scan and conform. At their suggestion, we used Change List software from Intelligent Assistance to provide a cut list, plus a standard EDL generated from Premiere Pro. In the end, the process wasn’t that much different after all.”

UPDATE – Hail, Caesar! Cinematography

You can hear from cinematographer Roger Deakins himself in this interview from Collider, covering the topics of shooting Hail, Caesar!, Blade Runner 2 and the future of digital photography.

Editing Deadpool in Premiere Pro

Julian Clarke, editor of District 9, Chappie, Elysium and Deadpool, is interviewed by The Verge in this fairly short but interesting article. I’ll update this portion of the post with more from Julian as it becomes available.

As far as the editing goes, what is it particularly about the Deadpool character that created a challenge for you?

Well, I think that the challenge with the Deadpool character — editing-wise and I think filmmaking-wise — is his greatest asset and also his greatest challenge. It is the sort of irreverent sense of humor and the meta thing, which is the thing that we love and the fans love, but it completely subverts dramatic tension, momentum, all these sorts of things that you need to kind of engage the audience in the movie.

UPDATE – Oliver Peter’s Deadpool Article

Oliver Peters has a nice write-up on the film’s post-production workflow that’s well worth a read, covering both creative and technical topics.

To make their collaborative workflow function, Nedomansky, Clarke, and the assistants worked out a structure for organizing files and Premiere Pro projects. Deadpool was broken into six reels, based on the approximate page count in the script where a reel break should occur. Every editor had their own folder on the Open Drives SAN containing only the most recent version of whatever project that they were working on. If Julian Clarke was done working on Reel 1, then that project file could be closed and moved from Clarke’s folder into the folder of one of the assistants. They would then open the project to add temporary sound effects or create some temporary visual effects. Meanwhile, Clarke would continue on Reel 2, which was located in his folder. By keeping only the active project file in the various folders and moving projects among editors’ folders, it would mimic the bin-locking method used in shared Avid workflows.
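The folder-per-editor scheme described above can be sketched as a small script. This is purely illustrative — the mount point, editor names and `hand_off` helper are hypothetical, not the actual Deadpool tooling — but it shows the idea: a reel’s project file lives in exactly one editor’s folder at a time, and moving it is the hand-off.

```python
import shutil
from pathlib import Path

# Hypothetical SAN mount; on Deadpool this was an Open Drives volume.
SAN_ROOT = Path("/Volumes/OpenDrives/deadpool")

def hand_off(reel: str, from_editor: str, to_editor: str,
             root: Path = SAN_ROOT) -> Path:
    """Move a reel's Premiere project between editors' folders.

    Presence of the .prproj in a folder marks ownership: only the
    editor whose folder currently holds the file opens it, which
    mimics Avid-style bin locking without any real lock mechanism.
    """
    src = root / from_editor / f"{reel}.prproj"
    dst = root / to_editor / src.name
    if not src.exists():
        raise FileNotFoundError(f"{from_editor} does not hold {reel}")
    if dst.exists():
        raise RuntimeError(f"{to_editor} already holds {reel}")
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(src), str(dst))  # same volume, so effectively a rename
    return dst

# e.g. Julian finishes Reel 1 and passes it to an assistant:
# hand_off("reel1", "julian", "assistant_sfx")
```

Because only one copy of the project file ever exists, two editors can never save conflicting versions of the same reel — the convention itself is the lock.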

UPDATE – Art of the Cut Interview

Julian Clarke is interviewed in depth by Steve Hullfish over on PVC, on numerous topics including the difference between cutting action and comedy and the delicate balance of combining them into one seamless film. Well worth a read, as always. Here’s Julian on editing action vs comedy:

It’s very different. Absolutely. They’re very different instincts. Editing action is interesting. I always feel like I’m learning more about it the more I do it. But there really is something very musical about editing action. I think when you first start editing action you’re like, “I’ll find the best bits and the best stunts and I kind of put them together in a way that’s very kinetic and there it is.” But often, when you do that, you find that the flow of the action is emotionally unsatisfying. There’s a journey and an arc with rises and falls where your characters are winning and losing, so sometimes you have to re-order things to have this correct musical flow to the action. Then comedy is really this very micro-sense to what is the rhythm of this joke and how much of this character’s reaction do I need to see and then sometimes this line doesn’t work, we need a different punch-line to make this pay off and then in this movie we’re often joking in the middle of action or in the middle of drama and then it’s like, “How much of this can we get away with before the scene falls on its face?”

UPDATE 2016 – AOTG Interview

Gordon Burkell of AOTG.com fame interviews Julian on the intricacies of editing Deadpool, looking at “the humour, pacing and timing of a moment that breaks the fourth wall and how, by doing so, affects the momentum of the scene.” You can check out a full transcript here too.

The Cutting Room: Well, when you can see there’s a back and forth between T. J. Miller and Ryan Reynolds, and I heard that they ad libbed a lot in their scenes together, so what were some of the challenges that this presented to you, and how did you overcome them?

Julian Clarke: In those sorts of situations, it’s just like an abundance of choice, you know, and it’s all funny. It becomes a focus on what’s the most funny, and also, sometimes something can be funny, but it feels kind of like it’s ‘just’ funny. It feels slightly, dramatically unreal or not quite in character or something like that, and so then this is really funny but it just slightly feels like just a funny line and less like in the emotion of the scene, and so then you have those sort of considerations as well.

Just saw #Deadpool and now a Q and A with the fantastic post team! pic.twitter.com/dHqjkI9FW9 — Jason Bowdach (@JBowdacious) February 14, 2016

The Deadpool editorial team, including Vashi, gave a post-screening Q+A on the Fox lot, which will appear online in about a week. Until then here are some highlights, thanks to Jason Bowdach’s live tweets of the event.

"Locking a project while in use in crucial to work with multiple editors and very possible in Premiere Pro" — Jason Bowdach (@JBowdacious) February 14, 2016

#Deadpool had significant amounts of ADR, similar to Top Gun, due to the mask — Jason Bowdach (@JBowdacious) February 14, 2016

The #Deadpool production burned through 10 Mac Pro "trash cans", which routinely failed and had to be rotated out regularly. — Jason Bowdach (@JBowdacious) February 14, 2016

#Deadpool main camera was shot on Arri Alexa at 3.5K ArriRaw, with Red and phantom used as secondary cameras . — Jason Bowdach (@JBowdacious) February 14, 2016

The offline edit was created using ProRes files, which were onlined at EFILM to the camera ArriRaw files or EXR files. #Deadpool — Jason Bowdach (@JBowdacious) February 14, 2016

color was burned into the dailies. Final color was performed at EFILM using ArriRaw & OpenEXR (VFX). Raw was used when possible #Deadpool — Jason Bowdach (@JBowdacious) February 14, 2016

There was two full weeks of reframing done to #Deadpool in color at EFILM. The open gate gave them a lot more options — Jason Bowdach (@JBowdacious) February 14, 2016

@3rdPassMedia The Premiere Pro Change List tool isn’t released yet but a beta can be obtained directly from https://t.co/mMryHpgzpK — Philip Hodgetts (@philiphodgetts) February 15, 2016

Here is another interesting tidbit of post production info and it will be exciting to see the release of this (and other feature workflow third party tools) for Premiere too.

Deadpool Visual Effects

FX Guide has a fantastic breakdown of the visual effects involved in several key sequences in the film, which provides excellent insight into some of the creative techniques involved. If you’re into VFX, this is a must-read.

Based on previs by Blur, production gathered background plates while filming in Detroit with a seven-camera RED DRAGON rig. Rothbart and DOP Ken Seng then devised a lightbox system for the greenscreen shoot. “We set up these LED panels all around the car using the Detroit plates,” explains Rothbart. “I lined them all up in the seven camera view and edited it so we had certain chunks we would use for each part of the sequence and then could map it out. For example, where the fight takes place in a tunnel, we had our tunnels loop. So the lights would dim, everything would go black and then it would come back to normal lighting as they come out of the tunnel. It meant that later on we got all that interactivity when we put the CG environment in.”

UPDATE – Title Sequence Breakdown

The Art of The Title has a brilliant breakdown and interview with director Tim Miller and Layout Supervisor Franck Balson on the creation of the film’s title sequence. It’s a great read and features early pre-vis videos and other behind the scenes goodies.

How was the sequence mapped out? Did you create a storyboard or simply work from the previs test?

Franck: It was only previs actually. The more [Blur] has worked on game cinematics the more we’ve realized that one of the quickest and best ways to try things out is previs. Boards are still a great tool but they only get you so far. In the context of this, where everything has to do with the placement of objects, you can discover so many things, those happy accidents, by working in this concrete computer generated world.

Vashi Nedomansky – Premiere Pro Coach to Top Editors

Vashi was generous enough to free up a good chunk of time to chat with me about his experience of cutting features in Premiere and how he approaches helping some of the industry’s most established editors make the switch (usually from Avid Media Composer) to Premiere Pro. I had a really good time chatting with him and learned a lot in the process. What resulted is a really, really long interview (7500 words!) that I’ve transcribed below. It’s so long in fact I thought you might want to download it and read it offline, so grab the text only PDF here, if you do.

A few headlines to whet your appetite:

Details of Vashi’s process as a workflow consultant on films like Gone Girl and Deadpool

Why we’re all benefiting from Fincher’s interaction with Adobe

Download free Premiere Pro effect presets used on Deadpool

Tips and tricks any Premiere editor can use to improve their workflow

If you want even more from Vashi on his approach to editing feature films in Premiere Pro, including several more video presentations, check out this previous post.

UPDATE – NAB 2016

Vashi gave a presentation on Deadpool to a packed crowd at the Adobe booth at NAB 2016. If that becomes available online I’ll update this post with it. In the meantime Vashi shared this screengrab of the actual Deadpool Premiere Pro timeline.

UPDATE – Steve Hullfish has also interviewed Vashi as part of his Art of The Cut series, which is well worth a read over on Pro Video Coalition.

HULLFISH: Tell me about how multi-cam was used and how many cameras were going on and how were they organizing the multi-cam in the bins.

NEDOMANSKY: Matt was getting them in and syncing them – just grab two shots, right-click, Make Multi-cam. You get to choose which audio channel is going to be playing initially, so you know you’re just getting the stereo mix usually. If you wanted to assign it to a lav, it would create the multi-clip with just that, and obviously you could change that later. So he could also do it in bulk: grab an entire folder of a scene, grab all of them, and you can quickly multi-cam clip everything in there and make your isolations visible. Most of the time it was two cameras. On all of the heavier action scenes and fight scenes there were definitely more cameras. I think Tim said that at some point they were using seven or eight on some of the bigger gags. But most of it was two camera.

Interview with Editor Vashi Nedomansky on Deadpool’s Premiere Pro Post Workflow

How did you become the go-to Premiere workflow specialist?

I started working at Bandito Brothers in 2006 as their lead editor, and I was there for about four years, and we used all Adobe software. They mostly did commercials but they also did a feature called Dust to Glory. And they used pretty much every camera on the market, and so they had to find some way to conform and cut it all in one program. So that’s how I started cutting on Premiere. The problem was it was still really shitty back then – it would crash all the time – but Adobe eventually worked through it, and essentially I’ve been cutting on it for a long time.

After my experience at Bandito Brothers I met some people from Adobe and when they started bringing other editors on board from other productions, including Gone Girl, Deadpool, and a lot of stuff before that, they asked me if I could go in for a week or two or three and work with the editors.

Having had plenty of experience cutting features, commercials and documentaries on it already, I could explain its strengths and weaknesses, and I could also help them make the transition [from Avid], because I personally had already made the transition from one platform to another, many years before.

How do you know what you know? Is it just experience, or is it a deeper technical knowledge of what Premiere is doing under the hood, because of your relationship with Adobe, that allows you to successfully plan a workflow and avoid any pitfalls that may be lurking along the way?

I’ve cut four feature films on Premiere Pro, and before that so many commercials, music videos and everything else that all of us editors do. I think a lot of it has to do with a preternatural feeling of knowing what it can handle. I know what’s going on, there are certain bugs – I’m sure you remember in FCP7, if you dragged a clip across one of the windows, you knew it was going to crash. It would lag for a second and I’d go ‘Oh no, here it comes… and crash. YAY!’

It’s like having a sixth sense. I’ve been cutting on Premiere for almost ten years now, through every iteration, and so I do know the ins and outs, specific shortcuts, tricks and secrets that a lot of people probably don’t know. And that’s gathered from experience and learning from other editors.

For example, last year I had a lot of comments about the ‘Pancake Timeline’. I came up with the name for it, but I obviously didn’t come up with what the pancake timeline is. But I gave it a catchy name so everyone could understand right away what it was.

That came about when I went into Fincher’s post house, they were using FCP7 before Gone Girl, and they were all using the ‘pancake timeline’ or whatever you want to call it – stacked timelines at that point.

And I had seen it before, and I had used it, but when you see Oscar-winning editors using it, you see how fast their workflow is, and how easy it is to plough through 20, 30, 40 hours of footage and quickly make a selects timeline.

It’s things like that, that you wouldn’t see unless you’re in that room, that I can share with people. I can’t tell you how many people have thanked me for introducing the concept, even though I had nothing to do with creating it. I just put it out there and said this is really efficient and shared knowledge.

And I think that’s a titbit that people didn’t use before, or didn’t know it existed or was possible to do, but now it’s very commonplace. Everywhere I go they’re doing that because it’s efficient.

That’s a great tip and I’ve really benefitted from it myself, especially after setting up a macro for it on my Wacom Tablet. [You can check out a post on how to do this over on RocketStock.com and Vashi’s original pancake timeline post, here, and follow up post here.]

As editors we’re always looking for shortcuts that allow us to be more creative and shorten the actual mechanics, so you have more time to be creative afterwards. We’re always up against a deadline, and there’s always more we want to do – ‘I wish I had another hour left to do an audio pass, so I can make this thing shine’ – but you so often don’t. I’m always on the lookout for those shortcuts and tricks.

How did you set up an Adobe centric workflow for Deadpool?

When I was first hired I actually went in to meet with the Deadpool team before they even made the decision of what cameras to use, what codec they were going to shoot in, the resolution, the frame size, what they were going to cut at, etc.

Needing answers to all those questions, it was great to be brought in at the beginning so we could create a fresh starting point, because the last thing you want to do as an editor, and especially as a post-production workflow specialist, is to get into something that’s already been set up and say ‘I’d like to change stuff, because it will work better’ when it’s already too late. I also helped to build out their physical studio base.

Blur is an animation company, but they built five professional, beautiful new edit bays inside their studio in preparation for cutting Deadpool in-house, with the added benefit that future productions can hit the ground running with systems that work. They cut no corners when it came to building those rooms out, and the Open Drives system of solid-state shared storage is huge and fast, so everyone could pull from it, in terms of the storage needs.

The first conversation with Tim Miller and his wife Jennifer, who run Blur Studios, was to discuss the camera choice and go over the possibilities. We shot on ARRI Alexa at 2.8 open gate, but we also wanted to find out whether there was an option of cutting in native ARRI RAW or whether we would have to do conversions to ProRes etc., so we ran some tests.

Because we shot open gate we had to crop the centre of the image for the dailies, but because of the way they shoot with the open gate, that centre cut is actually raised – it’s not a centre cut in the middle of the frame, it’s raised up another 25%, because that’s how the camera lens shoots it.

So that’s not just a simple centre cut in Premiere Pro – if you just made a timeline it would be offset incorrectly – so we had to create projects set up to correctly offset the centre cut, so it would play regularly in Premiere Pro, but also have the burnt-in codes and the timecodes and everything else above and below the picture, so that when the studio watched the dailies they could see which take it was and so on.

So a lot of my process was setting up these projects so they would play back in the right position with the right offset because we were delivering in anamorphic 2.39, so the image itself was 1.85 or 1.76. So that was one challenge.
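The raised centre cut Vashi describes is simple arithmetic, sketched below. The frame dimensions and the interpretation of “raised up another 25%” (taken here as a fraction of the centred margin) are assumptions for illustration, not production numbers from the show.

```python
def crop_top(src_h: int, crop_h: int, raise_frac: float = 0.25) -> int:
    """Top edge of the extraction window, in pixels from the top of frame.

    raise_frac = 0 gives a true centre cut; 0.25 raises the window by
    25% of the centred margin (one reading of "raised up another 25%" --
    the exact definition used on the show isn't stated).
    """
    margin = (src_h - crop_h) / 2   # slack above a perfectly centred crop
    return round(margin * (1 - raise_frac))

# Hypothetical open-gate frame and a full-width 2.39:1 extraction
src_w, src_h = 3424, 2202             # assumed sensor dimensions
crop_h = round(src_w / 2.39)          # scope extraction height: 1433 px
top_centred = crop_top(src_h, crop_h, 0.0)
top_raised = crop_top(src_h, crop_h)  # sits higher than the centred cut
```

Getting this offset wrong is exactly the “offset incorrectly” problem described above: a naive centred crop would frame the wrong slice of the negative.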

The other challenge was setting up different projects. One project would have all the footage in for the studio, so they could look at the dailies, in their own auditorium or in their own edit bays. That was set up with the centre cut but open so you could see the timecode displays.

But while we were cutting we didn’t want to see all that information; we just wanted to see the image cropped and matted as it would look in the theatre, so in the cutting room we could be overwhelmed by the dramatic moment, not by timecodes going by. As an editor it really is distracting to see that popping up all the time.

Having helped out a lot on Gone Girl, we got a really solid system going there, and that was a huge benefit to us when we started on Deadpool. During Gone Girl, David Fincher, Kirk Baxter, Angus Wall and Tyler Nelson really, really tore down Premiere Pro in the best way possible, and came up with a huge number of requests – ‘we need it to do this’, ‘I’d like it to do that’, ‘this is critical, we can’t move forward unless it does this.’

Adobe really went above and beyond to have engineers on hand, making these changes and actually writing the new code into the software during the months that Gone Girl was in post.

And so the latest version, which we started on a year ago for Deadpool, is the version that was released to the public, with every Fincher fix and every post-production item addressed – I think it was something like more than 200 requests over Gone Girl’s 14-16 months of post.

So all those things were things that Adobe didn’t already have in there, because they needed to hear from working people – ‘this is what we would like’ – because no company can foresee what everyone wants and stay ahead of the curve; they’re basically playing catch-up all the time.

But they had that time period to really invest in themselves and now we are all working with basically the Fincher version, that has all these options that weren’t there before and allows us to be more productive and more efficient and open to new workflows, moving forward.

What would be some key examples of a critical Fincher request?

Two that apply to everyone, are project load times and Render and Replace.

Fincher’s film had 500 hours of footage, obviously one project could hold it all, but it would take 10 minutes to open that project, and if you want to switch to another project – the same thing, and then come back to it, it takes way too long.

So the engineers and Jeff Brue, the owner of Open Drives, together wrote code that reduced the loading time by a factor of 10, from 10 minutes to one minute. That happened in the middle of Gone Girl post.

The load times are based on the time it takes to actually access the metadata, because all that Premiere is doing is checksumming all the files, checking their location, checking that nothing’s changed – that’s all it’s doing.

Is that a lot to do with the fast (Open Drives) storage?

It’s not the storage. It’s not how it’s pinging it. This is code that’s in Premiere Pro, it has nothing to do with the storage, this could be applied to a USB 3 drive, or a FW800 drive for that matter. It’s just how Premiere Pro sees the metadata and how it pings it when it’s opening, to get you there faster.
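To illustrate what “checksumming the files and checking nothing’s changed” might look like in principle — this is a hypothetical sketch, not Adobe’s actual implementation — a verification pass can hash cheap metadata plus a small sample of each file rather than the full media, which is why it is bound by per-file overhead rather than raw storage speed:

```python
import hashlib
import os

def fingerprint(path: str, sample_bytes: int = 64 * 1024) -> str:
    """Cheap identity check for a media file: hash its size and the
    first 64 KB instead of reading gigabytes of essence.
    Illustrative only -- Premiere's real verification is not public.
    """
    st = os.stat(path)
    h = hashlib.sha1()
    h.update(str(st.st_size).encode())
    with open(path, "rb") as f:
        h.update(f.read(sample_bytes))
    return h.hexdigest()

def verify_project(manifest: dict) -> list:
    """Return clips whose current fingerprint no longer matches the one
    recorded at last save (file missing, moved, or changed)."""
    return [path for path, digest in manifest.items()
            if not os.path.exists(path) or fingerprint(path) != digest]
```

With 500 hours of footage, even a few milliseconds of this per clip adds up, which is consistent with the claim that the 10x speed-up came from smarter code rather than faster storage.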

On Gone Girl they had six reels, each one 20-25 minutes, and on Deadpool the same thing. We ended up with 6 reels, because it was about an hour-fifty film. We found that to be the best way, especially for the team coming from Avid, where they were used to working in reels, so we wanted to make it comfortable for them. And for passing off to VFX and sound it helps to divide it into reels, instead of just one long project.

Which can also be done – I’ve done that many times. I cut a documentary last year, That Which I Love Destroys Me, on PTSD, that was one reel. An hour and a half, with 9 different formats, all real time, all native codecs.

I also cut Sharknado 2, all on one timeline, because I like to see a bird’s eye view of the whole timeline, because you can gather a lot of information like cutting tempo, pacing, all that kind of stuff – you can see it if you zoom out. So I like to have that in one timeline, if the platform allows me to.

Did they have a one-project one-timeline, slimmed down screener project?

They have a screening project which is basically just the current sequences. So it’s basically an empty project where you import the latest 6 reels and that’s it. It will import just the footage used in those 6 reels, and the assets (via the media browser), and it is a lot slimmer obviously – it only has an hour and 50 minutes of footage in that one project.

And that’s the project we would use to screen for studios, or that’s what we would use to make a DCP for another screening, or make a Pro Res or whatever. If we wanted to watch something internally, we’d have one project that just had those six reels in it.

And as we talked about at Sundance, the naming conventions are the only thing that changes. If you’re coming from Avid, a Bin is now basically a Project. Once you make that equation you’re like “OK, I get it, I just have to open that project/bin to get that information.”

What was the workflow process for screenings and viewings?

We played back from the timeline to a 65” 4K Sony TV for reviews in Julian’s main edit bay with director Tim Miller and for other private screenings. For screenings at FOX we would either send over a Premiere Pro project that would re-link to their drive of cloned footage, or, more often, we created flattened QuickTimes of each reel, with 5.1 audio, and played them back using Premiere on a studio workstation that had an AJA Kona 3G card in it.

And the second Fincher feature we’re all benefiting from?

If you have lots of dynamic linking going on in a timeline – 10-20 things, all dynamically linked – it gets… sluggish. It’s pinging After Effects, the storage, Premiere Pro and so on. So one of the biggest things that Fincher’s team asked for, and got, was Render and Replace. That’s directly from them.

They said, once we’ve made our change in After Effects we’d like to be able to render out and replace it in the Premiere timeline so it’s not taking the hit on, not only the RAM but also the storage and on After Effects.

So they wrote that, and I think that’s huge. Once you’ve signed off on a shot you can render it out, and you always have the option to go back, and un-render it and open up the After Effects project. So nothing’s ever completely destructive or irreversible.

Working with a rendered file, that’s easy for Premiere to deal with. So those two things I think are huge, and are things that anyone in any workflow can benefit from.

How does Premiere and shared storage work within a team of assistants and editors passing work around?

We had a couple of options, and we went out to a couple of vendors to see some of the parameters. I tried EVO SNS, which has proprietary file browser software for Mac where you can lock projects and stuff. It’s really, really cool, but I don’t think anyone wanted to take a chance at that stage with third-party software, because then you’re introducing another thing that’s out of Adobe’s hands and out of Blur Studio’s hands.

I personally am going to be using the EVO SNS for my projects because I just like the way it’s set up.

That said, we had to find a way to keep everyone on the same page, and so what we did was to create, on our main shared storage drive where we had all our footage, one folder for each person.

So Julian Clarke the editor had his folder, Matt the first assistant had his folder, VFX had their folder, and so on. There were five folders each with the person or department’s name on it, and inside that folder you would keep your latest Premiere project with your current sequences in it and only your most current work.

For example, Julian was working with six reels and so he had six projects in his folder, one for each reel. If you needed something you knew that that would be the latest version. No matter what.

So if you needed to grab something, you could go through the media browser, open, inspect and just take whatever you want, and because it’s non-destructive you knew you couldn’t mess up that project.

There’s nothing you could do to break it, as it’s only pointing to it. So we didn’t have to worry about locking stuff [the Avid way] because you can’t break it. All you had to do was communicate “oh, go in my folder, I just finished Reel 3” and then grab that and update the master project.
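The folder-per-person convention can be sketched as a small script. The folder names and `.prproj` layout below are illustrative assumptions, not the production’s actual paths:

```python
from pathlib import Path

def latest_project(folder):
    """Return the most recently modified .prproj in a folder, or None."""
    projects = sorted(Path(folder).glob("*.prproj"),
                      key=lambda p: p.stat().st_mtime, reverse=True)
    return projects[0] if projects else None

def current_work(root, people):
    """Map each editor/department folder to its newest project file."""
    return {name: latest_project(Path(root) / name) for name in people}

# Hypothetical layout mirroring the convention described above:
#   /storage/Julian/Reel_3_v2.prproj   <- always the latest version
#   /storage/Matt/...
#   /storage/VFX/...
```

Because anyone opening these projects does so read-only through the media browser, the newest file in each folder is, by convention, always safe to grab.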

What was the process for turning over to other departments like sound or VFX?

Every studio has their own deliverables list for all of this, so it is different per studio. With this, they wanted the Avid Log Exchange, like from Avid, and so we had to do our version of that. [Ed. note – Mike Kanfer from Adobe speaks to this in the Coen brothers editing section above.]

For the final audio we just exported an AAF along with a reference video file to bounce to Pro Tools for the post sound teams to do their work. That’s the same as for anyone.

Handling the VFX internally, we had a VFX editor in one of the other bays managing the flow of 1100 shots.

We’d start off with basically stick figures – drawings that Tim and the art team had done, animated just to have the timing of the shot. The timing is the most critical thing for editorial, because if you time it wrong and it’s too long, the time and money it costs to make those extra six frames of animation is crazy! So you have to time it realistically, with some handles, but you can’t get carried away. And all the time you’ve got to maintain that flow of turning over to VFX so they can create the final shots.

So we were constantly swapping out those shots – first stick figures, then animated drawings, then low-res animatics. And because Blur is so amazing at animation, even the first low-res versions looked insane! How do you do that?! By the next version it would be almost final – I mean, you can make tweaks, but…

Having everyone inside one building I think is a trend that’s happening more and more. If you look at Fincher and Gone Girl, everyone’s internally there. You look at JJ Abrams and Star Wars at Bad Robot in Santa Monica, they have not only all their scoring and mixing but their VFX and green screens, in house. Nothing ever leaves.

I think that’s the way of the future, only because the technology has come to a point where it’s not only affordable, but small enough and controllable, and it’s not like 15 years ago in a studio, where it’s $150,000 per Avid, and it has to stay in this room – and that’s only for cutting offline.

These days for a short film you can literally cut at 4K or 6K and swap out your VFX and deliver your DCP in a couple of days.

The temp VFX were created in After Effects. I believe all the masters, the final VFX shots, were done in NUKE, and they were just swapped out, into our timeline, as normal.

But we did try to empower both the assistants and Julian to know that if they wanted to add a move, or a little effect or whatever, they could do it in Premiere, or use dynamic link and do it in After Effects, for themselves. Because as you get closer to the finish line, if those guys can do it themselves it’s not a long process.

Not like back in the olden days – listen to grandpa here – where you’d export a QuickTime file, open it in After Effects, do your work, export another QuickTime file and import that back into Premiere Pro. Just to do one change! Just to do one stupid little thing it would take so much time, and you’d create so much more data and files that would clog up everything. And then you’d lose track of the numbering and forget – is that the last version? You wouldn’t even know.

That was the benefit of doing as much temp VFX as possible and, more importantly, of updating each current version in Premiere Pro without that back and forth, passing video files everywhere. That was a major benefit.

What was Julian and Matt’s experience of cutting in Premiere? Did they do much temp colour correction or audio mixing etc.?

The role of the editor these days is to deliver a cut that is as close to final as possible, given the circumstances. Any editor that’s worth their salt will make sure there’s no shot that stands out, that wasn’t colour corrected suitably. Not that you have to do a colour correction pass, but like if there’s a bummer – a bad shot – you’ve got to fix that shot so it blends with the other stuff as much as possible. That’s a no brainer. So we did use temp S-Curves, just to give it some more contrast and mattes and whatever to make it look as good as possible.

Audio-wise, they were super impressed with not only the mixer but the real-time audio, and the fact that you can layer as many audio tracks as you want in Premiere Pro and continually get real-time playback.

I’m sure you remember the FCP7 days – you’d have a good few audio tracks and you’d get the beep, beep, beep, and you’re like, I can’t play the audio back! I mean, come on. And then you’d shift one of the tracks by one frame and have to re-render the whole thing.

So being able to dig deep matters. I’m a huge proponent of building the audio as deep and as close to final as possible, and they were building really deep.

I created 12 audio presets for them, that they ended up using quite a bit, specific to this film. Things like ringouts that they wanted or special garbled voice changers, that are all built with limiters and compressors and high pass and distortion. So I built about a dozen presets for them that they used a lot.

[Download some of Vashi’s previous audio presets for Premiere Pro here]

UPDATE – Vashi put together this demonstration of the camera shake presets, which you can check out in more detail here. Further update – You can now download all the actual full-length presets as used by the Deadpool team, courtesy of Jarle, over on PremierePro.net. He also explains why you might want to turn on Motion Blur, via After Effects, on the shakiest presets.

Our good friend Jarle from Norway has 98 free Premiere Pro presets. I brought those in to Deadpool, and the team ended up using a lot of them, because they’re all built on Premiere’s internal effects and play back in real time.

The brilliant creative energy and intelligence that Jarle has to make simple stuff that works, and doesn’t tax the system, made the Deadpool team use them a lot.

He actually created 3 or 4 handheld camera effects, which we used throughout the Deadpool production, by shooting with real cameras – wide lens, long lens, handycam and so on – and then taking the tracked motion and putting it into an actual Premiere Pro preset.

So it’s not artificial, it’s not a random wiggle from After Effects, it’s actual footage mapped into Premiere Pro presets. And I think one of them is in his latest release, but we were lucky enough to have three or four different ones, just to help with certain shots that were locked off, that we wanted to give a little bit of life to.

Things like that are really cool. Between the audio presets and the video presets, it just gave us more options, in the moment, to add the feeling we needed without having to hand it off to a dedicated VFX or audio team – we could just do it ourselves.

How did you work with the Deadpool post team – what did that look like?

When I first sat down with the post team for Deadpool, every single one of them had worked on Avid on their last several projects, they were a team, they had an established workflow and they had a way of doing things. So my first and foremost job was to gain their trust as an editor, not as a salesman – because I don’t work for Adobe. I was hired by Blur and hired by Fox for this production.

I sat down with each of them individually for two or three days to get them up and running and the first thing I did was to customise their Premiere Pro set up so it matched, as close as possible, their Avid system. Because as an editor – a lot of it’s about comfort, speed, keyboard shortcuts and knowing where everything is.

So I literally mapped their Avid keyboards onto Premiere, or showed them where each function was in Premiere, and it’s funny – even though they were all Avid people, they each had their keyboards set up a different way. None of them had the default Avid keyboard; they all had their own custom stuff. I also set up their two-monitor layouts and got everything where they wanted it. They were really impressed that they could resize stuff, move stuff, tab stuff.

UPDATE – You can download the two-monitor Premiere Project that Vashi created as a starting point for the Deadpool editors here.

And then the tilde key! [~] Every time I show an editor that’s never used Premiere the tilde key, they literally lose their shit. Just being able to go full screen with one button! It’s so simple. I can’t believe that every NLE doesn’t have it.

Another thing is that there are a lot of preferences to set in Premiere Pro, both under Preferences and under the Project Settings. All of those have to be mapped out properly as well, before any computer starts.

Once we started sharing projects, one thing we didn’t realise was that some of the systems in the five edit bays didn’t have all the settings matched exactly. So when a project was saved on one system and opened on another computer, those other settings would supersede the first ones, because the settings are project based.

For example, if we had a 4K timeline and you dropped a 2K clip in there, we had the preferences set up so that it would blow it up to fit the 4K timeline (via the Media tab in Preferences). If you didn’t set that, it would just be a small 2K image floating inside the 4K frame.

So some of the systems didn’t have this, and so we had a situation where initially stuff would be different sizes on different systems, and people were shouting “What’s going on? Premiere’s crashing! Premiere’s changing all the footage!”

But it came down to this: I didn’t go to every computer and make sure every preference was identical before we started, so that’s on me. When someone opened their own personal project that had different settings and then saved their latest Deadpool cut, it superseded what I’d already set. So it took a couple of days of the projects rippling through before everything was the same.

Now I think you can lock the preferences and assign them so you don’t have to deal with that. But that was a quick little learning curve, and it came out of the blue – things were changing on different computers and we didn’t know why, until we realised it was the carry-over from the preferences. Little things like that came up.
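To make the 2K-in-4K scaling example above concrete, “Set to Frame Size” applies a uniform scale so the clip fits the sequence frame. A minimal sketch of the arithmetic – the exact resolutions here are assumptions for illustration:

```python
def scale_to_frame(clip_w, clip_h, seq_w, seq_h):
    """Uniform scale, as a percentage, that fits a clip inside a sequence
    frame while preserving its aspect ratio."""
    return round(min(seq_w / clip_w, seq_h / clip_h) * 100, 2)

# A 2K (2048x1080) clip in a 4K (4096x2160) timeline needs 200% scale;
# left at the default 100%, it sits as a small image inside the frame.
```

With the preference unset on one machine and set on another, the same clip opens at different sizes – exactly the mismatch the team ran into.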

Preferences can make a huge difference to how your system works. For example, assigning the maximum RAM – a lot of people don’t go that deep. They’re like, whatever, I’ve got 32GB, however it’s assigned is fine. No. Put it all towards Premiere Pro, because you’re going to get better performance.

For the last chunk of those first three weeks, my job was just to bounce around from edit bay to edit bay because everyone had questions. How do you do this again? Do it like that – oh yeah, that’s right, I forgot. And then I’d go to the next room and they’d have 10 questions and I’d plough through those.

These editors and assistants are so intelligent that they pick up stuff so quickly, they see it once and they remember it, they just have to ask – it’s easier to ask a person than looking it up on Google.

I did have a manual I created – a top-secret 25-page best-practices book I’d put together for another job, which I customised for them. They all had hard-copy printouts of it in their edit bays to reference.

It covers everything from ingest to export and that’s something that when I get hired, I can bring to the table and I tailor it to the project, because every project is different. I use the same book myself, making sure I’m doing everything in the right order and the right way.

After the initial three weeks, I came back almost every month for a day or two, just to pop in, see how everything is going. I’d make sure everything is optimised, answer any questions that are popping up.

At one point they did bring in a couple of other editors who needed to be trained on Premiere. So they went through a crash course where they had two days with me each. One guy was coming from FCP7, another from Avid. They needed to be trained quickly and to be shown how everything works. Again that’s my job. And those guys were cutting as well.

There were huge deadlines on Deadpool – it comes out now in 11 days. They really pushed the limits of deliverables and deadlines on this. But that always happens. You’re going to keep cutting, and post is going to keep going, until they literally take the film away.

But the rewards are on the screen, because it’s a fantastic movie. I love it – I’ve seen it about 5 times now. Five different versions, that is.

Why do you think people are still entrenched in Avid?

My take on all that is that the whole editing team, whether editor or assistant, basically does about 10 functions all day long, consistently. Marking in/out, insert something, delete, ripple delete, slide something a frame or two here and there. It’s not like we’re using every function a program has to offer all day long. But you do need to know what they can do and you need to be able to access it all quickly.

Avid’s strengths are definitely shared storage and stability over the long haul, especially in the studios. That’s a given, and it’s a focus for Premiere Pro too. But I think the functionality – you know, the Lumetri colour engine, the audio capabilities – is far and away better than any other platform I’ve ever worked on.

The only thing that I think needs to match that functionality is the shared storage, where people can work on it safely and share easily and have nothing bog them down.

Just before one of these projects I worked at Time Warner Cable Sports, in LA. I trained 72 editors as they switched over from FCP7 to Premiere Pro. And these guys are putting out 6, half hour shows, every day. They’re using Premiere Pro and making deadlines and working with shared storage.

So in an extreme case like that, where after the initial couple of weeks everything was fine, that speaks volumes. That’s insane! Six live shows, 60/70 editors all working together every day. Shared storage was my main concern, but there were no doubts over the storage – nothing ever slowed down. Their concern going in was: how do we share stuff? Because obviously in Avid that’s a normal thing.

For the Deadpool team, their biggest surprise was the real-time playback of everything and the audio depth they could build out. Deadpool relies hugely on music and sound effects – as most action films do – so the fact that they could go really deep and make the most of the interface and the plugins we had was huge. There were no major complaints, and I would tell you if there was something nagging at them.

There wasn’t, partly because there was no time to nag. We just had to cut and move forward. It was a very clean process, cleaner than I expected, and it was a huge commitment on everyone’s part to roll the dice, especially coming from an Avid team.

I think it took the editors and assistant editors a couple of days to figure out where everything was and what the shortcuts are, and at that point you’re off and running and there’s almost no time to complain. Bottom line, there were no show-stoppers on this film. Nothing ever happened where a machine went down, or Premiere went down, where anyone lost time or a shot or a cut or anything like that. That’s the most important thing: as long as the film is ploughing forward and you’re not being limited by the technology, or by an interface that isn’t doing what you want it to, you’re in a good spot.

Personally, because Premiere is so customisable, every single one of the Avid editors made it exactly how they wanted, so there was nothing but nice words. With Avid you can’t really customise very much; you’re kind of locked into that system and its layout.

Do you get to input into Adobe’s development process?

I’m really happy that they’re the first platform to bring native support for the H.265 codec. I’ve been shooting a lot of projects on the Samsung NX1 over the last year. It surprised the hell out of me when I heard they were going to do that. I got the beta and it worked quite well right away, and by the first public iteration it was flawless – that really impressed me, because I had been converting the files to ProRes and the files were five times bigger.

I’m constantly surprised by the stuff they do come out with, because it’s either something I didn’t think about, or something that I’m like ‘wow’ that’s going to help me a lot.

Whatever platform you’re on, as long as the core functionality is fast, stable and customisable, you’re covered – so much of what we do as editors is just looking at footage, picking shots, moving them around and making adjustments.

I’ve cut features on Avid, on FCP7 and I use Resolve to edit sometimes. I listen to the podcasts of supporters of all the other platforms, because I want to make sure I’m not missing out on something. That would be criminal, as an editor. If something was a breakthrough, or on the verge of breaking through, and it can help you improve your workflow, then you should be smart enough to try it out and to stay on top of that. That’s your job.

I was a professional hockey player for ten years, and if I didn’t exercise and work out every day, I was going to lose my job, because I’d be trailing, losing my touch and not doing the core things I had to do to be good at my job.

Being an editor, a lot of it is creative obviously, a lot of it’s interacting with people and being a good person in a room, but a lot of it is being technically adept and ahead of the curve, if possible. Again, anything you can do that gets you more time to be creative will make your stories better, your interactions with the director better and your end product better.

I’m always on top of everything. I’m cutting on whatever I have, to see – maybe this is better? Who knows!

What’s next for you personally? Are you still working on Grind?

Grind is still in post – I’ve got to finish that up eventually. It just takes so much time. If I sat down for three months and did everything myself, I could finish it; I just don’t have three months right now. I’ll get it done. The story is the story, and at first I was worried – you know, when we shot it we had iPhone 2s in it. If someone watches it now they’ll be like – iPhone 2s?! What is this?

I’m sure we’ll have to put a card up front and set it in the time it was. So I’ll chip away at that. But right now I’m also shooting a documentary on my father, actually. He was the first player to defect from a communist country and play in the National Hockey League, back in the 70s.

It happened during the Cold War – he was literally chased by KGB spies across the planet. The FBI was also after him because they didn’t know if he was a spy or not. Athletes defecting and going from Czechoslovakia to Canada and then Detroit – that just didn’t happen. Now it happens all the time, but for anyone who’s the first at something, that’s a singular moment and a singular event. And I wanted to explain that process.

The film is more of an intrigue/thriller because it’s crazy. It’s an absolutely crazy true story. The other thing about it is that we left a communist system, where everything is controlled and you have no rights and no freedoms, and came to the West, to capitalism – and you assume the movie ends when he escapes and he’s free. Well, when he arrived he got screwed over by lawyers and all sorts of people, because he just assumed everyone was friendly and nice and looking out for his best interests. So it’s also a dialogue between capitalism and communism in one person’s story, and how, if you’re not careful, anything can happen.

I’m shooting a lot of recreations, going back to Prague shooting a lot of stuff there. It’s not going to be talking heads. It’s not going to be a sit down thing. It’s going to have a very cinematic visual style, very video orientated with audio when you need it.

When you’re watching something and the story is amazing, but you’re just watching people talk, and oh look, there’s a photograph. That’s been done to death!

It’s fine, you know, but I believe we have a responsibility as filmmakers to make it as interesting as we can. Narrative filmmaking and documentary filmmaking are merging every day, and sometimes you don’t even know if you’re watching a documentary. That’s what I’m trying to give – that feeling, more of an entertaining cinematic look, while also obviously telling the story faithfully, but not making it fucking boring. Because that can happen so quickly, so I want to avoid that, obviously.

The other project I’m starting on shortly is a sci-fi film, 10 million dollar budget indie film, called Jessica Frost. I read the script and loved it and we should be starting soon. It’s kind of like Mad Max: Fury Road meets The Matrix, out in a desert.

It sounded like an amazing project, and I’m just waiting for them, they’re in preproduction now and we should know in the next month or so, when it starts going. But I’m excited to dig into that, as my next feature.

How do you pick your projects, as an editor?

Some things are time related. Each project is different and it’s kind of what your soul needs at that moment. And sometimes what my soul needs is money, and I got to go take some jobs. I’ll cut most any short film or commercial, or whatever my agent sends for that stuff. Those projects are a couple of days or a week at the most.

Feature films are different – you have to be ready to invest 6 months, 9 months maybe a year of your time with a small selection of people and try to tell the story as best you can, so you have to make the right decision.

Because I have worked on projects where you’re into it and you’re like, fuck, I really shouldn’t have taken this – I should have seen the red flags. So for me, it’s about being ready to commit such a huge amount of time.

Everything I do is based on the script and the story first. If I can connect with that, if I can read it on the page and I can see it in my head like I’m cutting it already, and it’s entertaining me and challenging me, then I’m very interested in that project.

But besides that, you only have so much time. If you do one film a year, or two feature films in a year, you’re working the whole year. There are no vacations. You can’t just take off for a week. The hours get crazy and so you have to be ready to commit and you have to be confident in that decision. It’s your family time that you’re taking away from. You want to make sure you pick the right project where you’re really enthused about it.

I always like to be working and cutting. So if I’m not working on a feature, then I’m doing my own stuff – I’ve written five or six scripts and I’m trying to get funding for those myself, to direct and whatnot. It’s always an ongoing process.

With Sharknado 2, I wanted to cut a shark film, but I also wanted to see how The Asylum made their movies. I’ve talked about this before, but they put out 24 films a year, each budget capped at around $1.5 million; they’ve never been late, they’ve never lost money and they’ve always hit their broadcast dates – that’s amazing! I wanted to see how they did it.

And I learned a lot. I got to go behind the curtain to see how they pull it off and that’s invaluable. As filmmakers we should always be looking at working models that actually deliver the goods.

For each film it’s a different reason for me, but it’s always for what I hope is in the best interests of my career and my future.

I like flexing different muscles. I think editorial is an amazing job in that you never know what the next project is going to be, there’s always variety and it’s always changing. I couldn’t imagine doing the same thing every day, knowing what my next week would be like, knowing what my next year would be like!

I’m super, super lucky though, and it takes a long time to get to a point where you get approached with projects. And, like every editor, I spent the first three or four years cutting the worst projects known to man – the worst shot, everything was terrible – not getting paid for any of it. But it got me some credits and taught me how to edit. I read every book, I went to film school, but until you work on your first project in the real world, none of that matters.

Those early days of struggle are so helpful because you have to learn to solve all sorts of technical problems and basic construction problems before you even get to the creative problems. Later on in your career the problems you have are thankfully largely to do with story, the characters and creative nuances.

Right, the further along in your career, it’s definitely more story based problems that you’re solving, because the technical aspects have all been taken care of for the most part, and you’re just trying to solve moments, and scenes.

But I agree with you, at the start I learned so many tricks by having to creatively solve story and technical issues that were just horrifying. Like a guy gets up from a table and exits camera left, and then he enters from the wrong side, and that’s all they have. And you’re like – who shot this?

Audio is always the worst initially. No lav or boom, just on-board camera audio. And they’re like “we want to submit this to Sundance!” and I’m like, yeah? But it sounds like shitballs, you can’t submit this anywhere!

I try and encourage the camera folk I work with to come in and review the rushes with me, so they can see how an editor looks at their footage, and what’s useful and what’s not, to get them to think about it through someone else’s eyes.

I agree with you completely, and that’s why, when I’m directing in the future, I’m glad I’ve already done every job on a film set and in the post-production world. I think the more you know about the different jobs, the better a filmmaker you are, and the better you are at your job.

An editor that’s been on set, and talked to the cameramen, is going to be better than someone who just sits in a room and says “give me the footage, I’m going to cut it.” You’re limiting yourself.

In the changing world of post-production today, editors have to be better at everything, and if you don’t go out and see how the other professionals are doing it, at the highest level, you’re just short-changing yourself. The more you see, the more you’ll learn, and the more valuable you’ll be.

Thanks Vashi for giving up so much of your time to chat with me today, I really appreciate it.

My sincere pleasure!

Vashi Nedomansky

VashiVisuals.com

Twitter: @vashikoo

Facebook: VashiVisuals

Instagram: @vashikoo

More from Vashi at Sundance

Adobe at Sundance hosted an interesting interview series, Pillow Talk – check out the full playlist here, which Vashi stopped by for, and you can hear some interesting advice on life as an editor and his approach to working with the rest of the cast and crew. Check out Vashi’s previous work and his excellent filmmaking blog at vashivisuals.com