Jonathan Gitlin
AUSTIN, Texas—As you might imagine, speed is extremely important in Formula 1. It's not just the racing cars that have to be fast, though. Data from thousands of sensors on each of the cars has to be piped from the track to each team's headquarters (usually in the UK) and then back to the track again, as near to real-time as possible. The same goes for gigabytes of HD video, since this is predominantly a televised sport. To get a better idea of how that all happens, we wanted to be on the scene. So even though a visit to some far-off (and dry) data centers might have done the trick, we were on hand at this year's soggy United States Grand Prix to witness data at F1 speed.
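That "near real-time" claim can be sanity-checked with a quick speed-of-light estimate. The numbers below are illustrative assumptions on our part—an ~8,000 km fiber route from Austin to the UK and a typical silica-fiber refractive index of ~1.47—not figures from FOM or Tata:

```python
# Rough physics check: one-way latency over fiber from Austin to the UK.
# Route length and refractive index are illustrative assumptions.
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47        # typical refractive index of silica fiber
ROUTE_KM = 8_000          # assumed fiber path length, Austin -> UK

speed_in_fiber = C_VACUUM_KM_S / FIBER_INDEX   # roughly 204,000 km/s
one_way_ms = ROUTE_KM / speed_in_fiber * 1000

print(f"one-way: {one_way_ms:.0f} ms, round trip: {2 * one_way_ms:.0f} ms")
```

In other words, even before any processing, physics alone puts the track-to-UK round trip somewhere in the tens of milliseconds—fast enough that "near real-time" is a fair description.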

The Circuit of the Americas, also known as COTA, is a state-of-the-art race track just a few miles east of Austin-Bergstrom International Airport. But for the Grand Prix, we returned to a somewhat different-looking COTA than the one we left a month earlier after the Lone Star Le Mans. (In fact, the F1 circus was in town to displace the actual circus—Cirque du Soleil—which had taken up residence in the main parking lot.)

To start, the farthest-reaching arms of Hurricane Patricia paid Austin a visit this weekend, swapping the usual Texan heat and humidity for three months' worth of average rainfall in a single weekend. Also gone was the freewheeling fun attitude of the Le Mans sports car race, where tickets are cheap and fan access plentiful. Instead, everything was tightly controlled. F1 is a much more elitist sport, and woe betide you if you had the wrong lanyard or pass and tried to gain entry to the paddock...

Thankfully so equipped, we were able to pass into the sport's inner sanctum for a rare look at F1's tech center. From the outside it was a relatively anonymous temporary building, one of many that travel with the teams and organizers to the so-called flyaway races—events that take place outside of F1's traditional but endangered heartland of Europe, where the cars and equipment are trucked from track to base and back again. Instead, for the races in the Americas, Asia, and beyond, everything gets loaded into freight 747s. In luggage terms, the tech center alone is responsible for a single jumbo—130 tonnes of servers, networking hardware, cabling, and so on. Much of it is built into frames the size of standard air freight pallets for easy assembly.

Alastair Staley/LAT Photographic

Inside the tent

We were shown around the tech center by John Morrison, CTO for Formula One Management, the company that controls the broadcasting and promotional rights of F1. FOM is based at Biggin Hill, a former World War II RAF base and now private airport outside London. Even on race day in Austin, its offices will be bustling with staff working with data piped from the track. Onsite at the track were roughly 200 additional staff who arrived several days before to set up the tech center and lay 12.4 miles (20km) of fiber around the place. That expansive and traveling infrastructure is necessary to support the TV cameras, microphones, cellular data networks, and everything else that motorsport's most expensive show expects to find.

Morrison led us through the first half of the tech center, an area filled with walled cubicles providing office space for his team, and into the back half of the cavernous tent. Here, bay after bay of rack-mounted equipment sat happily humming away in the air-conditioned environment.

This is where the timing for all the cars on track is calculated, a complicated and extremely important job in a sport that's timed down to the millisecond. This aspect is particularly vital given the vast sums of money at stake for the teams based on their results across the season. As an example, back-of-the-grid team Manor scored just two points in 2014, courtesy of Jules Bianchi's ninth place finish at that year's Monaco race. Those two points meant Manor finished 2014 in 10th place out of 11 teams, but this translated into as much as $40 million in prize money at the end of the year. The result saved Manor from bankruptcy, a fate that Caterham F1, the team they beat, could not avoid. [Correction—Caterham actually finished 9th and beat both Manor and Sauber.]

[UPDATE: After this article appeared, Tata Communications contacted us to say certain details originally published here fell under the terms of our nondisclosure agreement with F1 management. We have removed several sentences describing the layout and staffing of the tech center.]

The direction of the TV broadcast is done on-site, but all the data is piped back to the UK before being distributed around the world. Broadcasters get access to what's called the world feed, but they can also get raw onboard and other feeds as well, depending on how much they pay FOM for the privilege. That distribution is handled from Biggin Hill, which is where Tata Communications comes in. The Indian firm has been F1's connectivity provider since 2012, handling the data connectivity from each race on the calendar back to FOM's home base. Tata also handles content delivery for online platforms (F1's mobile apps and website) and provides data connectivity for several F1 teams, including Mercedes and McLaren-Honda.

Tata sends about 13 people to each race, and these employees turn up the week before to begin setting everything up. They provision a 1Gbps circuit at each track, an order of magnitude more bandwidth than just three years ago when the technical partnership with F1 started. That's throttled back to 300Mbps, traveling on Tata's own fiber from each circuit back to the UK (from Austin the data was traveling to Dallas, with a backup to Chicago on a Time Warner Cable Business Class fiber).
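The 1Gbps circuit and 300Mbps throttle are the article's figures; to get a feel for what they mean in practice, here's a back-of-envelope calculation. The 100 GB payload is an illustrative assumption (the article only says "gigabytes" of video and sensor data per event), and the math ignores protocol overhead:

```python
# Idealized transfer times over the provisioned link. Link speeds come
# from the article; the 100 GB payload is an illustrative assumption.
def transfer_time_seconds(payload_gb: float, link_mbps: float) -> float:
    """Time to move payload_gb (decimal GB) at link_mbps, no overhead."""
    payload_bits = payload_gb * 8 * 1e9     # decimal gigabytes -> bits
    return payload_bits / (link_mbps * 1e6)

for mbps in (300, 1000):
    minutes = transfer_time_seconds(100, mbps) / 60
    print(f"100 GB at {mbps} Mbps: {minutes:.1f} minutes")
```

At the throttled 300Mbps, 100 GB takes roughly three-quarters of an hour in this idealized model—which is why the live timing and broadcast feeds are streamed continuously rather than shipped in bulk.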

It's all critical data, and this soon becomes a 24-hour-a-day job given the global nature of the sport. Tata's service manager, Gary Crocombe, and data engineer, Rob Hamilton, relayed to us the dread of those 4am phone calls (and the relief when they turn out to be well-meaning family members who forgot about time differences). Crocombe and Hamilton also showed us the real-time metrics of F1's online content delivery down to the state level here in the US (although Crocombe lamented that in the UK they aren't even able to differentiate Wales from England).

Listing image by Peter van Egmond