Editing the Tour de France on Final Cut Pro X

It's now no secret that ITV4’s 2014 highlights coverage of the Tour de France was made by a team of editors working on Final Cut Pro X.

A production of this scale and quick turnaround had never been tried before on FCPX. Four edit suites with new Mac Pros connected to a 70 TB Xsan. All producing a highlights show on air on ITV4 at 7pm every day for 21 days. And with the Tour starting in the UK in Yorkshire and our output also being live on ITV1 on Saturday and Sunday afternoon to millions of viewers, the pressure was certainly on.

So enjoy our biggest FCPX user story of the year so far!

How did it all work out? Well, you’ll have to read the whole article to find out, but let's just say we broke new ground, FCPX performed well and we won't be going back.

So where to start? At the beginning? Well, we actually need to go back a bit further. Back to 2005, when the decision to produce the coverage of the Tour on FCP 4.5 was taken. In some ways this was a larger leap, as we had traditionally made the shows editing on to tape from a virtually controlled fileserver, the Grass Valley Profile.

The Tour team were the first to develop the FCP method with a growing-file workflow, which has now become standard practice for many sporting and live event productions.

Back then, many people said we were mad and that what we were trying to do couldn't be done. We proved them all wrong. So when the transition from FCP7 to FCPX was talked about, we were used to people waving their arms about saying that it wouldn't work. We had many déjà vu moments.

This article features comments from four different writers: myself, Peter Wiggins, the Senior Editor (this was to be my 19th time editing the Tour); James Venner, the Producer who took the decision to go to Final Cut Pro 7 and then again to Final Cut Pro X; Tony Davies, an Assistant Producer who worked with me in the suite; and Sylvain Swimer from TimelineTV, the company that provided the facilities.

Senior Editor Peter Wiggins with Assistant Producer Tony Davies

Let's start off with a bit of history from James as he asks: why FCPX?

"We had been editing our Tour de France programmes on FCP7 since 2005. By the time that Apple launched FCPX in 2011 the software was already starting to show its age. We had moved to HD production the year before and whilst the system still worked we had experienced some crashes and general instability.

When we first saw FCPX at a demo, like many we were perplexed. The software had some clever features to be sure, but much of what we thought we needed seemed not to be present. In particular, that first version had no support for working on a SAN or for collaborative editing, and no apparent way of dealing with a separate clean effects track. Until these things were solved we couldn't even consider FCPX for our workflow. On top of that, Apple’s radical reimagining of the editing interface would mean a difficult transition for our editors. We came away from that demo saying that it would be at least three years before the software was likely to be ready for us to use.

And so we stuck resolutely to FCP7. It still worked, albeit a bit creakily, but we knew its foibles and limitations and had an established workflow. We waited nervously for the OS update that might break FCP7, but largely we continued on in our comfortable rut, making the shows the way we knew how.

Throughout the time since its launch we would give occasional glances at FCPX to see how it was progressing. Update followed update but our list of show stopping issues remained largely untouched. Apple said it would work on a SAN but only with each edit working within its own sandboxed area. No collaboration. Still not for us.

70TB of Xsan powered by Promise drive chassis.

Then in October last year we were asked to do a job which entailed uploading clips from an Outside Broadcast to the client's YouTube page. The clips needed to be online within 10 minutes. FCPX, with its ability to publish to YouTube from the timeline, seemed like the tool for the job and so we set about exploring that option. Could we stream from an EVS and edit for the stream? Well, sort of. Growing files didn't work but closed files did, although, because of the quirky way EVS recorded QuickTime, it couldn’t properly display timecode. This wasn't a show stopper for that job, but would obviously be a barrier to using FCPX more widely.

With the release of 10.1, we began to feel that the last of our major obstacles had been addressed. Of course there were bound to be more problems on the way, but I reasoned that we'd only find those, and solutions to them, if we were actually using the software. If we sat and waited for a completely problem-free solution we’d be waiting forever.

I was starting to get concerned about how long we could continue to edit in FCP7. By now it had had no support for three years and very little in the way of updates for two years before that. At some point it was going to break; already we were seeing slowdowns and hangs. Yes, we could probably hold out another year, but that would be all. Time to look for a replacement.

There were broadly three options for us:

1. Avid. Still widely used in the broadcast industry. Many swear by it, others swear at it. Count me in the latter camp. I've just never got on with it. I cannot remember a single job on Avid where there hasn't been a problem. None of my editors were very keen either. It also felt like an ageing system in need of a rewrite. With no support beyond 1920 x 1080, this didn't seem like an option that was going to give me a modern editing platform for the next five-plus years.

2. Adobe Premiere. On the face of it there was much to recommend Premiere. It looked a lot like FCP7, so the learning curve for the editors would be shallow. It has good integration with EVS and with After Effects. And yet I sensed a general lack of enthusiasm when I spoke to some editors and engineers. Yes it does a job, yes it would do the job for me, but I couldn't find anyone who really loved it.

Also, it's an ageing platform with a lot of old code. How long before it too needed a ground-up rewrite? It also didn't seem to move us much further along: it's like FCP7 but with 64-bit code. A bit faster, a bit more stable, but a bit of a mezzanine step. I felt that we'd be looking to move on again in a couple of years' time.

3. FCPX. So that left Final Cut Pro X, or “iMovie Pro” as some people called it. We decided to try it on another project. This was just for stand-alone edits, but it taught us a lot about the interface, and we liked what we found. It was fast; it was easy to find material; we felt much more in touch with our rushes. It also has a great ecosystem with many plug-ins being developed. Could we make it work in a shared environment?

The four edit suites are in the old film cutting rooms at Ealing Studios.

There were some big hurdles to cross: the learning curve would be steep for the editors, and EVS would have to be removed from the record path because they showed no inclination to make their files compatible. On the plus side, we'd be doing something new. I wanted an edit system that made us re-examine our workflow, rethink why and how we did things, and hopefully inject some new creativity. I wanted something that would grow with us over several years.

I didn't want a system that just let us keep doing the same old thing. Time to roll the dice."

So the decision had been made, the next step was to work out what equipment we needed. We had edited the Tour at Timeline for the previous two years and knew their system well.

The main problem was one of connectivity: all their previous-generation Mac Pros had been connected to Xsan via fibre and an internal PCIe HBA board. We intended to equip each of the four edit suites with super-fast new Mac Pros, and of course these connect to the outside world with Thunderbolt 2.

Each suite would have:

One Mac Pro

Two Apple 27” monitors

One Promise SanLink 2 (Thunderbolt to Fibre Channel)

One Promise Pegasus2 R6 RAID

One AJA Io 4K

We searched online for information on the best practice on what should be plugged into which port. After a bit of head scratching and testing, we came up with a plan on how to wire up each new Mac Pro into the suites. It might seem odd that the local and shared storage is connected via the monitors, but there’s more than enough bandwidth in Thunderbolt 2 to drive the displays and pass the data.



This frees up the third Thunderbolt port for the AJA Io 4K to use on its own. I was really impressed with the device, coming from editing on FCP7 with EVS files via a Kona 3, where lip-sync could be ±2 frames out when you hit the play button. With FCPX, the Io 4K was rock solid. Not only did it never lose lip-sync, it also responded instantly, matching the viewer when skimming over clips. We actually forgot we were using it and left the FCPX broadcast output on all the time whilst editing. The outputs of the genlocked Io 4K were also fed into the machine room so that they could appear on the vision mixer or be recorded into the EVS.

The AJA Io 4K

The Promise SanLink 2s were another piece of kit that performed exactly as you'd expect. We chose to put dual fibres in, as we had noticed some speed gains from this method in testing. FCPX can demand a lot of data for certain functions, such as drawing waveforms, as these are made up of large numbers of very small packets of data. We were keen to tweak our setup for maximum performance.

Promise SanLink2 Thunderbolt 2 to Fibre Channel.

Although most of the data and video files were stored on Xsan, we had the Promise Pegasus2 Thunderbolt 2 RAIDs connected for local storage. It would have been very easy to fill up the Mac Pro’s internal storage with exported programme parts and elements. The drive also gave us a level of redundancy, being able to edit should the Xsan go down, which thankfully didn’t happen, although we got very, very close one day.

A Promise Pegasus2 R6 for very fast local storage.

With two Apple 27” monitors, it made sense to show the events on the second display. That meant that we had a whole screen to stretch the filmstrips across.

So with the edit suites all sorted, it was time to look at the actual workflow.

We have a team of about 15 in France who have an OB truck based at the finish of every day’s stage. They have two mobile cameras that shoot interviews, pieces to camera and so on, plus a three-camera studio set. The daily one-hour highlights show is made up of race coverage from French television, in the form of a multilateral satellite feed, combined with the extra material sent from our team in France on a unilateral satellite feed.

France also sends us voice-over, and the last two components of the programme are music from our large library, all stored on Xsan, and of course graphics, which came from a dedicated team working with Viz (to match ITV’s corporate style) and After Effects. Full-frame graphics were recorded in ProRes 422; anything with transparency got rendered out in ProRes 4444.

This year, as James hinted earlier, we could not use an EVS to record the multilateral and unilateral feeds and stream them to the Xsan.

We believe that this is because EVS hasn’t yet written AVFoundation-compatible components into its QuickTime files. It also explains why EVS streams don't work with QuickLook. As somebody once said, editing without timecode is like cutting butter with a cricket bat, and as EVS showed little interest in re-writing their streams to proper QuickTime specifications, we had to look elsewhere for a solution.

I’ll let Sylvain take up the technical side of the show:

"MovieRecorder from Softron was already released, and it proved to be a very capable EVS replacement in the record chain; it provided the ingest facility for the entire project. It fully supports AVFoundation and was therefore perfect for FCPX, recording a growing file that can be accessed within seconds.

One new Mac Pro was able to record multiple streams of video in real-time while a second worked in parallel as a back-up recorder. MovieRecorder is designed with a focus on edit-while-ingest for FCPX and performed admirably.

Softron's MovieRecorder - One channel recording the Unilateral, the other the Multilateral satellite feed.

We used AJA's Io 4K hardware for capture and all footage was streamed directly to the 70TB Xsan in Apple ProRes 422. In a large improvement over legacy workflows, MovieRecorder's 'edit-while-ingest' facility made the footage available to the edits within five seconds of capture so everyone could access the footage in practically real-time. We must thank Pierre Chevalier from Softron for not only providing excellent product support, but for also adding a few tweaks to the program which helped us a lot.

For ingest with MovieRecorder - Two Mac Pros, two AJA Io 4ks, two Promise SanLink2s and Promise M4.



Voice-over media was recorded in France and sent back to Ealing via VSquared TV's FTP server. The simplified design of the new Compressor software made converting the files a breeze, thanks to easy-access presets and favourite destinations.

FCPX was able to fit neatly into the traditional 'Send To EVS' workflow that we've relied on for past Tours. We export QuickTime movies in native ProRes 422 which get picked up by the EVS XTAccess server and sent directly to the EVS XT for immediate play-out. In the past we'd run into difficulties with FCP7 dropping progressive frames into interlaced exports that would force us to re-render footage and re-export, costing valuable time with a TX deadline looming.

With FCPX this problem is no more; even when working with progressive sources on an interlaced timeline, all media was exported as 1080i, with background renders happening so fast on the new Mac hardware that you could barely tell any rendering was required. The system worked faster than ever, with exported parts ready for TX a couple of minutes after editing.

Sylvain at work setting off a recording on MovieRecorder.

To enable a collaborative workflow, we developed the concept of the Master Library. The Master Library contained all the archive footage for the Tour and a folder structure to aid editing. All four suites took this Master Library as their ‘workspace’ for the day.

At the end of the edit day, this renamed Library then had all of its cache files deleted before being zipped up and put back onto the SAN. All media was external, but as we had a lot of archive clips, each daily library still came in at just under 2 Gigs. The editors would also hand-delete any optical flow files, which seemed to bloat the Library.

The Master Library would then be updated to include all the footage recorded for that stage, and the next morning the process would begin anew. This allowed us to manage the Libraries and keep track of edits easily, whilst giving editors a head start thanks to the pre-built archive. All the media was fully keyworded, so finding footage was quick and simple, whether it was a specific stage of a previous Tour or a simple graphic overlay."
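As a rough illustration of that nightly turnaround, the cache-stripping and zipping step could be scripted along these lines. This is a sketch only: the cache folder names are assumptions based on the FCPX 10.1 library bundle layout, and the paths are hypothetical, not the production setup.

```python
import shutil
from pathlib import Path

# Folder names inside an .fcpbundle that hold regenerable cache data.
# ASSUMPTION: based on the FCPX 10.1 library layout, not the exact
# production script used on the Tour.
CACHE_DIRS = {"Render Files", "Transcoded Media", "Analysis Files", "Peaks Data"}

def strip_caches(library: Path) -> int:
    """Delete regenerable cache folders so the zipped library stays small."""
    removed = 0
    # Walk deepest-first so nested cache folders are removed before parents.
    for p in sorted(library.rglob("*"), reverse=True):
        if p.is_dir() and p.name in CACHE_DIRS:
            shutil.rmtree(p)
            removed += 1
    return removed

def archive_library(library: Path, san_folder: Path) -> Path:
    """Strip caches, then zip the library bundle onto the SAN."""
    strip_caches(library)
    zip_base = san_folder / library.stem  # e.g. "Master Library"
    return Path(shutil.make_archive(str(zip_base), "zip",
                                    root_dir=library.parent,
                                    base_dir=library.name))
```

Because the media stayed external to the Library, only the small database and folder structure travel in the zip, which is why each daily library could stay under 2 gigs.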

So onto the editing which was my department!

The Libraries from Edit 1 on the desktop; the ZIPs get uploaded to the SAN.

Using Sylvain's Master Library, it took a while to realise that you only had to import the new footage from that day; all the previous feeds, voice-over and graphics had been combined every night into the new, updated Master Library. Importing everything again was a hangover from the FCP7 days! It was a good feeling to know that you had everything from the production so far, all of it instantly skimmable to find a shot or item.

We also had our entire archive library stored on the SAN and keyworded in an event. We had over 150 hours of past races right the way back to very old grainy black and white film coverage. We had decided to keep the last three years of race multilateral feeds so we also had access to every pedal turn of racing from 2011, 2012 and 2013.

The Master Library, with all the archive media, this year's race media and all projects, took two seconds to open on the new Mac Pro. This was probably the one thing that caused the most raised eyebrows when visitors toured the suites. Everything online and just a skim away, all within two seconds of opening the Library. Many thanks here to Alex Snelling for setting this up before the race started.

Over 150 hours of archive in the Library, all ready for skimming.

It took a few days to get into it, but the editing just got faster and faster. The magnetic timeline proved its worth as we could snip away at race footage knowing that we would never lose sync or leave a stray section or clip at the end of the timeline. The reverse was also true: inserting a clip into the most densely edited part of the timeline was done in seconds, with everything else rippling out of the way. Rearranging shots in a montage or shortening one down was achieved in seconds.

I ended up having a contest with myself to see how much I could edit without stopping the timeline playing. Applying a name super as a connected clip was easy, extending music under voice-over didn't stop the cursor either. The whole timeline felt very ‘Live.’ Over three weeks, through trial and error, I worked out when it was best to use connected clips or secondary storylines, to expand the audio or to keep it closed and which method of trimming was the best or quickest. I really felt that it could edit as fast as I could work it.

I also forgot about rendering having set the background render to start after 10 seconds. This meant it didn't get in the way of editing, but it would kick in if I stopped for any reason such as a runner asking what I’d like for lunch!

I’ve many times compared FCPX to a Formula 1 racing car and I think the analogy extrapolates well to what we were doing on the Tour. You couldn’t really buy faster FCPX edit suites than we had, but you had to know how they worked otherwise it was easy to spin off the track.

An example to back this up: we noticed that FCPX was slowing down to the point where it actually became difficult to edit. After a lot of tests, we traced it down to the Mac Pro and FCPX generating waveforms in the background. If you import a six-hour clip, FCPX will quite happily sit there analysing the audio to build the waveform. You can cancel one of the processes in the Background Tasks window, but there is another process behind that which will only stop if you restart FCPX.

The moral of the story is not to import an hours-long clip with the Inspector open, as this will trigger the waveform generation and slow things down when working on a SAN. Also make sure the waveform display on clips in the Browser is turned off, and try to avoid opening up the audio components in the Inspector, as that will slow things down too. Again, this was only necessary because we had such incredibly long clips.

Collaboration is one of the things that FCPX gets criticised for, and as we often share cut items between suites, we had to work out a way to achieve this without duplicating media. A few tests with exporting and importing XML proved that speed changes and music edits in secondary storylines both suffered in the process. It turns out that XML is primarily designed to work with third-party applications and is purposefully not a lossless round trip because of that design. We settled on saving a new “transfer” Library with the relevant project as the only item in it. This was then compressed and stored on the SAN in a clearly labelled folder for another suite to pick up.
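That transfer-Library hand-off could be sketched as a small helper like the one below. This is purely illustrative: the date-plus-suite folder naming is a hypothetical convention, not the labelling scheme actually used on the production.

```python
import datetime
import shutil
from pathlib import Path

def publish_transfer(transfer_lib: Path, san_transfers: Path, suite: str) -> Path:
    """Zip a single-project 'transfer' library into a clearly labelled
    SAN folder so another suite can spot it and pick it up.

    The "<date>_<suite>" label is an assumed convention for this sketch.
    """
    stamp = datetime.date.today().isoformat()      # e.g. "2014-07-14"
    dest_dir = san_transfers / f"{stamp}_{suite}"  # labelled pickup folder
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.make_archive(str(dest_dir / transfer_lib.stem), "zip",
                                    root_dir=transfer_lib.parent,
                                    base_dir=transfer_lib.name))
```

The receiving suite simply unzips the Library locally; because the media is external and lives on the SAN, only the project and Library database travel in the zip.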

I started off the Tour thinking that FCPX desperately needed audio only crossfades. It soon became clear that using the integrated fade handles to ramp the audio proved to be much more flexible as shots could be moved around at will without losing an attached crossfade. The application of the handles can be a bit fiddly and an ‘add handles’ keyboard shortcut would be appreciated.

A secondary storyline showing the audio fade handles.

I would still, however, like to have audio automation on clips. Nothing beats the ear/brain/hand feedback loop for speed and accuracy. Also on the audio side, we received the race footage as split tracks: clean commentary on one and two, and clean effects on three and four. This meant that I could apply the excellent AUDynamics Processor to the commentary to compress the levels for broadcast. I much prefer this compressor over the built-in FCPX version as you can actually draw the compression ‘knee’ in real time. Good news for Adobe fans (see, not all FCPX news here!): as this is part of the OS, you can access the same plug-in in Premiere if you're on a Mac.

A couple of other mentions for plug-ins here. On a couple of days, some interviews fed from France had an audio fault, with very bad crackling making them untransmittable. iZotope’s RX3 gave us a large paddle in the creek, removing almost all of the problems with its de-crackle and de-click filters. I'm slightly embarrassed to say that we got away with the 10-day trial too; however, if it happened again, the $300 price of the plug-in is a lot cheaper than a reshoot and the satellite time to get it back to Ealing. Even our sound man Nick, who mixes the live show, came into the suite to say well done for cleaning the interviews up. He also suggested that R128 metering would be a cool addition to FCPX.

We also heavily used XEffects Toolkit plugin for broadcast countdown clocks, widescreen mattes, highlighters and a few other tools that made life easier. The subtitler in the pack was used on nearly every programme.

Sitting in the suite working FCPX is one thing, but how did the production staff feel about the new technology? Tony Davies reflects on how FCPX affects the producer.

"It’s unreasonable to expect that there will be no teething problems with using a new editing system but I was rather fortunate to miss the most difficult period of the first three days as I was on the road with the crew for the three Stages that wound their way down from Yorkshire to London. Notwithstanding the fact that I had watched three seamless highlights shows on each of those evenings, I was expecting to arrive at Ealing Studios and feel like one of those new Marines marching in to find a lot of bloodied combat veterans with thousand yard stares.

At the end of my first day in the edit I was only half joking when I said, “I don’t know what all the fuss was about. Piece of cake.”

Of course it wasn’t a piece of cake for those charged with making the technical integration work or, indeed, for editors having to use a new and unique NLE in a live broadcast environment. So first of all I have to say hats off to the collaborative and supportive effort that made all that work. As an aside, I have half-heartedly dabbled with FCPX as an edit tool myself on a couple of occasions and have yet to get to grips with it, but our consummate professional editors did it remarkably well.

Clive Nicholson editing the highlights show.

In any case, my perspective on the Tour de France is entirely about how FCPX worked as a production tool. When it comes down to it, all NLEs put pictures and sound together in whatever way you tell them to. I have never subscribed to the notion that you can’t edit stuff on any of the major NLE systems. If you couldn’t, nobody would use them at all.

I have to say that FCPX had a whole lot of production features I really liked. We live or die by our logging system. If we can’t find the right shots from our live stream or from archive we can’t edit with them. But FCPX brought a whole new benefit: as well as being able to look at the traditional FileMaker Pro logs and call out time-codes we could view the streams as filmstrips. This sounds like a minor thing but when we were scrambling to find jersey presentations, replays, interviews etc it ended up being quicker to just point at the appropriate thumbnails and grab the shots we needed.

Using filmstrip mode it was easy to find shots.

With an Italian leader and French jersey wearers for much of the Tour, our interviews were dominated by non-English speakers, necessitating the dreaded, time-consuming subtitling process. Using an FxFactory plug-in made this job a piece of cake (apart from the unavoidable part where you have to read every word like a 10-year-old to ensure there are no typing errors in the original translations).

And, you know what, I can’t think of much else. But that’s not a bad thing because I was expecting to be listing a whole load of things I didn’t like (my conservative nature again). The editing was as quick as usual, the transfers to EVS certainly quicker and my days were no more or less stressful than before. FCPX doesn’t need puff pieces any more than it needs savaging. In the hands of our editors it did the job in a broadcast environment without adding to my grey hairs.

I think what does need to be recognised is that it is likely to become a better production tool as time goes by. I am looking forward to the day when the logging system is seamlessly incorporated into the FCPX database function and when the import functions mimic the Finder layout options rather than being a single column. But all that will come in time. For the moment I am happy that I can head to Ealing next year and worry about what to have for lunch rather than our edit system."



I think Tony summed it up rather neatly. We have made the change to a new NLE and we can see the whole system just getting better and better every year.

For me as an editor, I had fun again. The hardware and the software got out of the way so that I could concentrate on what the viewers saw onscreen. Instead of trying to find a quick way of approximating the look of a complicated effect, we did the complicated effect, or more, and added a lot of polish to the show.

I could respond to the AP asking ‘we need the shot of this’ by just skimming to the clip (maybe after a keyword click) and adding it to the timeline. No more ‘it will do’ - we used the best stuff we could find and regularly went back to replace shots if better examples came in on the feed.

I think once a sports or reality producer has used FCPX and has seen how easy it is to find a specific shot, going back to other NLEs will all seem a bit clunky and slow. Tony and I ended up having races to see who could find a shot first. Most often, by the time Tony had found timecode, I had already edited the clip on to the timeline. Combining a live logging system that publishes into an updatable event in FCPX would be absolute heaven. Philip Hodgetts and Pierre from Softron should talk!

The gallery on air with the live show.

The key to editing a sports programme that is aired on the same day is knowing what you can do in the time available. With a fixed TX time of 1900 every night, your part (or parts!) have to be there, you cannot transmit black. You haven’t got the luxury of coming back the next day and having another crack at solving a problem. Not many editors can work under such stress. Very often we would be editing the end of part four whilst part three was on air. On one day we were actually editing the end of part four whilst the beginning of part four was on air! You have one chance to get it right and for that you need speed and confidence in your edit system.

FCPX provided both.



Problems? Two.

On the first day we had SAN speed issues and, as FCPX was the new kid on the block, fingers were pointed. When a file was exported to the EVS, it caused the EVS to crash in such a spectacular way that our operator, ‘Rocksteady’ Ron Bradley, said he had never seen such a crash in all the years he has been working with it. We eventually traced the problem to a faulty EVS XTAccess box that not only caused the crash but, as a result of it falling over, caused the SAN to slow down as well.

The faulty box was swopped out for a new machine, and after that we didn’t have a single export from FCPX fail when pushed to the EVS and then played on air. We are now looking at other play-out solutions (Mac Pro based), as the EVS no longer records the feeds. Although EVS is an industry standard, it takes up a large chunk of a production budget to hire.
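For illustration, the export hand-off described earlier — FCPX writes a finished ProRes part, XTAccess picks it up and pushes it to the EVS XT — boils down to a classic watch-folder pattern: move the file on once it has stopped growing. A minimal sketch, with hypothetical paths and a simple stable-size check standing in for whatever XTAccess actually does internally:

```python
import time
from pathlib import Path

def is_stable(path: Path, interval: float = 2.0) -> bool:
    """True once the file size stops changing, i.e. the export has finished."""
    size = path.stat().st_size
    time.sleep(interval)
    return path.stat().st_size == size

def hand_off(export_dir: Path, watch_dir: Path, seen: set,
             interval: float = 2.0) -> list:
    """Move newly exported programme parts into the play-out watch folder."""
    moved = []
    for mov in sorted(export_dir.glob("*.mov")):
        if mov.name in seen or not is_stable(mov, interval):
            continue
        dest = watch_dir / mov.name
        mov.rename(dest)  # same volume, so this is an atomic move
        seen.add(mov.name)
        moved.append(dest)
    return moved
```

Run in a loop, each pass sweeps any finished .mov exports across; the `seen` set stops a part being delivered twice.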

The second problem was the waveform analysis and generation on long files causing a slowdown in FCPX. Once the workaround was discovered, this wasn’t really an issue anymore, but slightly annoying when you forgot.



Conclusion: We all love top tens, so how about summing up our experience with FCPX into bullet points:



1) Fast. Fast to import, fast to edit, fast to export.

2) Fast exports to EVS for transmission in minutes without a single fail (ever since XTAccess swop!)

3) Graphical logging for AP’s - find shots quickly on the filmstrips.

4) Magnetic Timeline - add and subtract shots without moving blocks of clips, trimming to make space, or fuss.

5) Quick render speeds to the point of forgetting about having to render.

6) Ability to have all the media from this year's race, all race archive & previous race multilaterals in the Library, all instantly skimmable

7) Complex graphic builds and layers not a problem (Bring them on, we have the power!)

8) Better use of speed changes with Blade speed and Optical Flow. Very handy for putting holds for timing in rendered graphic files

9) The Master Library from item 6 opened in about two seconds.

10) It was fun.

Just a few thank yous to get in. First of all to James and Carolyn at VSquared who had the confidence to go ahead and trailblaze with the new technology. Charlie Tear and Sylvain Swimer at TimelineTV in Ealing who showed enormous enthusiasm in the run up, testing and the days that we were on air. Alex Snelling who helped out with the workflow and Pierre at Softron for his support, even the morning after Belgium’s defeat of the USA in the World Cup!

©Peter Wiggins / FCP.co