Hurricane Sandy, which made landfall near Brigantine, New Jersey, on Oct. 29, 2012, was a showcase of modern weather forecasting — but not necessarily one enabled by American engineering.

The storm shined a spotlight on the superiority of a computer model run by a European weather center, the European Centre for Medium-Range Weather Forecasts (ECMWF), which, more than a week in advance, pinpointed Sandy's infamous left-hook track directly into New Jersey.

Now, two years after that monstrous storm, the same computing gap remains — in some cases growing even wider. In addition, the Weather Service is trying to shore up even more basic elements of its infrastructure, like satellites and computer networks. These issues raise the question of whether the agency is ready to face another Sandy.

Sandy's track was unprecedented in the modern history of Atlantic hurricanes, and it wasn't until days after the ECMWF model (also known as the "Euro") showed the ominous left turn that the American GFS computer model came into agreement with its European counterpart, giving forecasters greater confidence in sounding the alarm.

The accurate forecasts likely saved lives and money, as families and companies were able to take protective measures. Still, nearly 150 people died and the storm cost at least $70 billion.

After the storm, the U.S. meteorology community focused on how far its computing power lagged behind that of the ECMWF and other international centers, and this sometimes harsh self-assessment continues.

Congress responded to Sandy with $48 million in a supplemental appropriation directed to the NWS, $25 million of which is being put toward supercomputing upgrades that should allow the Weather Service to run faster, higher-resolution models. In announcing the money, the agency claimed it would soon out-super the supercomputers of its international partners.

NWS Director Louis Uccellini told Climate Central in March 2013 that the agency was making the "largest increase in computing capacity that we've ever had," advertising a jump in processing power to 1.9 petaflops by the end of fiscal year 2015. A petaflop is a measure of computing speed equal to one quadrillion calculations per second.

Yet more than a year later, in the race for more teraflops and petaflops, the NWS remains far behind other countries' weather agencies, as Uccellini acknowledged in an interview with Mashable. However, he said his agency is working "aggressively" to change the situation, and that it has not fallen behind schedule.

Satellite image from the Suomi NPP satellite on Oct. 29, 2012. Image: NASA/NOAA

He added that the Sandy-related funding has put the agency "on a path" toward rivaling the UK Met Office, the ECMWF and other centers. According to Uccellini, the NWS is on track to have peak processing speeds of 1.5 petaflops by January 2015, and to ramp up further after that.

“We are not sitting on the 1.5 petaflops that we’re entering 2015 with,” he said. “We are very aggressively attempting to build on that.”

An event this week has fueled criticism of Uccellini's agency for being slow to improve its computing and address other major issues. On Tuesday, the UK Met Office announced the purchase of a $128 million supercomputer, from a U.S. manufacturer no less, that it says will yield more detailed and accurate forecasts of day-to-day weather as well as seasonal climate predictions. The supercomputer, which will be built by Cray Inc. of Seattle, Washington, will have a processing power of 16 petaflops by 2017. According to a Cray press release, the deal is the company's largest supercomputer contract ever outside the United States.

Put another way, the Met Office's computer is expected to perform 16,000 trillion calculations per second when brought up to full capacity.
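
For scale, here is a quick back-of-the-envelope comparison in Python, using only the figures quoted in this story (not official benchmarks):

```python
PETAFLOP = 10 ** 15  # one quadrillion floating-point calculations per second

met_office_2017 = 16 * PETAFLOP   # Met Office Cray at full capacity, by 2017
nws_jan_2015 = 1.5 * PETAFLOP     # NWS peak expected by January 2015

# 16 petaflops expressed in "trillions of calculations per second"
print(f"{met_office_2017 / 10 ** 12:,.0f} trillion calculations per second")  # 16,000
print(f"Met Office vs. NWS: {met_office_2017 / nws_jan_2015:.1f}x")           # ~10.7x
```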

The ECMWF has also invested in Cray supercomputing infrastructure with "multi-petaflops" of processing speed. Its Euro model also has a superior way of ingesting the massive volume of weather data coming in from satellites, aircraft, surface observation stations and other sources, a process known as data assimilation.
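
To illustrate what data assimilation does, here is a minimal single-variable sketch in Python: a Kalman-style update that blends the model's prior guess with a new observation, weighting each by its estimated error. This is a toy illustration of the general idea, not the ECMWF's actual system, which uses far more sophisticated variational methods over millions of variables.

```python
def assimilate(background, bg_var, obs, obs_var):
    """Blend a model 'background' value with a new observation,
    weighting each by the inverse of its error variance
    (a scalar Kalman-filter update)."""
    gain = bg_var / (bg_var + obs_var)   # how much to trust the observation
    analysis = background + gain * (obs - background)
    analysis_var = (1.0 - gain) * bg_var
    return analysis, analysis_var

# Example: the model's first guess of 20.0 C (error variance 4.0) meets a
# more precise satellite retrieval of 18.5 C (error variance 1.0).
value, var = assimilate(20.0, 4.0, 18.5, 1.0)
print(value, var)  # 18.8 0.8 -- pulled strongly toward the better observation
```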

Each additional teraflop of computing power matters because processing power allows forecasters to run models at higher resolution and more frequent intervals, using smaller grid boxes that can capture small-scale weather phenomena, such as severe thunderstorms and terrain effects, that slower, coarser models miss.
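
The arithmetic behind that race is unforgiving: halving a global model's grid spacing roughly quadruples the number of grid columns, and the time step usually has to shrink in proportion, so each halving costs roughly eight times as much computing. A rough Python sketch (the resolutions below are illustrative examples, not any agency's actual configuration):

```python
EARTH_SURFACE_KM2 = 510_000_000  # Earth's surface area, approximately

def grid_cost(spacing_km, base_km=27.0):
    """Rough relative cost of a global model run at a given grid spacing.
    Grid columns scale with 1/spacing^2, and the time step shrinks with
    spacing (CFL stability condition), so cost ~ (base/spacing)^3."""
    columns = EARTH_SURFACE_KM2 / spacing_km ** 2
    relative_cost = (base_km / spacing_km) ** 3
    return columns, relative_cost

for km in (27.0, 13.0, 3.0):  # coarse global scale down to storm-resolving scale
    cols, cost = grid_cost(km)
    print(f"{km:4.0f} km grid: ~{cols:12,.0f} columns, ~{cost:6,.0f}x base cost")
```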

The track of Hurricane Sandy compared to all previous hurricanes in the same region. Image: UCAR via NOAA

More importantly, faster computers enable forecasters to run what are known as model ensembles: the same model run many times, each from slightly different starting conditions, allowing meteorologists to assess the uncertainty of a particular forecast.
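
Here is a toy demonstration of the ensemble idea in Python, using the Lorenz-63 equations, a classic chaotic stand-in for the atmosphere, rather than a real weather model. Each member starts from a slightly perturbed initial state, and the spread of the members' end states is a measure of forecast uncertainty:

```python
import random

def lorenz_step(x, y, z, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the chaotic Lorenz-63 system."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

members = []
for _ in range(20):  # 20 ensemble members
    # Each member starts from a slightly perturbed initial condition,
    # mimicking imperfect knowledge of the atmosphere's current state.
    state = (1.0 + random.gauss(0.0, 0.01), 1.0, 1.0)
    for _ in range(1500):  # integrate forward in time
        state = lorenz_step(*state)
    members.append(state[0])

mean = sum(members) / len(members)
spread = (sum((m - mean) ** 2 for m in members) / len(members)) ** 0.5
print(f"ensemble mean = {mean:.2f}, spread = {spread:.2f}")
# A large spread signals low confidence in the forecast; a small one, high confidence.
```

Operational centers do the same thing with full weather models and dozens of members, which is one reason every extra petaflop counts.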

NWS director pushes back against criticism

A PowerPoint presentation from a high-level NWS official dated Aug. 5, 2014, first reported by the Washington Post's Capital Weather Gang blog, shows that the agency won't have processing power comparable to the Met Office's (2.7 petaflops) until sometime between 2018 and 2020.

According to Uccellini, however, those August figures are no longer accurate. He said he could not provide details to refute them because the NWS is in the "final stages" of contract negotiations for new processors.

One reason the new computing capacity won't come online until after 2015 is that the National Oceanic and Atmospheric Administration (NOAA), the Weather Service's parent agency, had a long-term agreement for supercomputing services with IBM, which recently sold its supercomputing business to Lenovo. Since Lenovo is a Chinese firm, the deal raised red flags in Congress, which put a hold on the purchase while the Treasury Department reviewed it for any national security implications.

The IBM Roadrunner supercomputer was the world's first petaflop computer. Image: Los Alamos National Laboratory

“That froze us,” Uccellini said. “They now have a plan in place to leapfrog into the next arena, and we’re still not there yet,” he added, referring to the Met Office.

In the meantime, critics are pouncing, pointing out that when it comes to weather computing, the NWS is behind not only the ECMWF and the Met Office, but even South Korea.

Cliff Mass, a professor at the University of Washington and weather blogger, wrote Wednesday that the U.S. has a greater need for powerful supercomputers than smaller countries:

"State-of-the-art weather prediction demands huge computer resources and thus the ability to forecast well depends on access to the top supercomputers in the world. Some numerical weather prediction models are run globally at moderate resolution, while others are run at ultra high resolution over smaller domains to predict small-scale features such as severe thunderstorms. Thus, a large nation, like the U.S., requires far more computer power than, say, South Korea or the United Kingdom."

According to Mass, the new Met Office supercomputer will have 20 times the computing power of the best comparable U.S. system.

"The U.S. atmospheric sciences community is the intellectual leader in meteorology and weather prediction and many of our research advances are applied overseas, such as at ECMWF and the UKMET office," he wrote. "The American people deserve to take advantage of the research they are paying for, but that can't happen with inferior computers and inferior forecasts."

NWS spokesman Chris Vaccaro said Mass' argument doesn't hold up when you consider the agency's recent actions and its plans for the next few years. “There is this notion of falling behind that is just not the case,” he said.

The NWS has upgraded some of its computer models since Sandy, including a hurricane forecasting model and a short-term forecasting model. In addition, the agency plans to run its GFS model at higher resolution out to 10 days, rather than the current five, and to improve the ways in which weather data flows into the models.

However, none of these steps will put the agency ahead of other weather forecasting centers worldwide. Instead, the agency is playing catch-up to reach parity with its international partners.

All this could be forgiven, though, if the Weather Service's woes didn't run much deeper than the uber-geeky world of supercomputing.

Satellites and Sandy

Last week, data from at least a half-dozen satellite instruments stopped streaming into the agency's computer systems, depriving computer models of much-needed information and potentially making them less accurate through the start of this week.

At the time, the agency maintained that the models, including the GFS and the Euro, still had plenty of data from other sources with which to project future weather conditions, and that forecast accuracy was not adversely affected.

Damage from Hurricane Sandy. Image: Mark Lennihan/Associated Press

But Uccellini said on Thursday that, in fact, the data losses did affect the accuracy of various computer models, including the GFS and the Euro, mainly because the data outage, which was caused by scheduled maintenance, lasted longer than planned.

“It turned out to be a longer outage than planned," he said. "It did have an impact on the model forecasts and we’re seeing it in the verification scores.”

Marshall Shepherd, a former president of the American Meteorological Society and a professor at the University of Georgia, told Mashable that had this satellite outage happened the same week two years ago, when Sandy was making its run at the East Coast, life-saving forecasts might have been at least somewhat less accurate.

“I don’t know exactly where we would’ve been if this had happened two years ago,” he said, referring to weather forecasters in general. However, he said the errors introduced probably would not have been so severe that they would have caused forecasters to miss the storm's threat entirely, because of the data that was still streaming in.

John Leslie, spokesman for NOAA's National Environmental Satellite Data and Information Service (NESDIS), which is the agency's satellite division, said those outages and a separate problem involving snow and ice data on the National Ice Center's website probably didn't affect forecasts much either.

"We can't speculate that scenario, however, fundamentally, the recent outage did not degrade weather forecasts," he wrote in an email on Wednesday.

Ryan Maue, a meteorologist with the private weather firm WeatherBell Analytics, said a partial loss of satellite data like the past week's, had it occurred during the run-up to Sandy, almost certainly would have affected forecast accuracy, though it's unclear how large the errors would have been.

"It would have affected it for sure," he said. "But we don't know how badly."

He said in a Twitter message exchange that an extended satellite outage would have prompted the models, including the GFS and the ECMWF, to adapt by giving more weight to other datasets that were still functioning.

The satellite data issues have been distinct from other technical snafus the agency has suffered since Sandy hit, including a massive false flood warning issued for the entire eastern U.S. in April, and the loss of access to weather.gov, the agency's home web portal, due to a poorly designed Android app.

Uccellini readily admits that the agency's warning and forecast dissemination system "is broken."

“We are very aggressively trying to replace our dissemination system,” he told Mashable in an interview. “I’m not walking away or trying to hide.”

As if the computing and dissemination issues weren't enough for the agency to deal with, the Weather Service is also likely to experience a gap in satellite coverage for a year or more starting in 2016, because the design lifetime of a current polar-orbiting satellite will end before its replacement is launched. That gap alone is likely to degrade forecast accuracy significantly, according to NOAA and reports from the Government Accountability Office.

So the NWS is facing challenges on three fronts: supercomputing, the communication of its routine forecasts and warnings, and satellite gaps. That's before accounting for the budget cuts imposed by Congress under the so-called sequester, which slashed spending government-wide.

In confronting them, the agency will be aided by its cadre of meteorological professionals, who have access to UK Met Office and ECMWF model data. But they can only do so much if they are increasingly hamstrung by a lack of key data and forecasting tools.

The stakes couldn't be higher. A 2011 study found that routine weather variability alone costs the economy $485 billion each year, without accounting for the billions in damage from major events, such as Hurricane Sandy.

With this in mind, one has to wonder whether the agency could forecast another Hurricane Sandy today, or a decade from now, and disseminate those warnings effectively. Considering that climate change is projected to result in more extreme weather and climate events, that's not a comforting thought.