After Chinese hackers infiltrated a Navy subcontractor’s computer network and stole a trove of highly sensitive data on submarine warfare, the government was spurred to revise the standards that contractors must follow to ensure government data is properly protected.

What the hackers took was “the equivalent of the stealth technology for the Air Force,” said Ron Ross, a fellow at the National Institute of Standards and Technology who focuses on computer security.

“We literally are hemorrhaging critical information about key programs,” Ross said during a fireside chat I moderated at the RSA Federal Summit Tuesday. “They’re coming after you every day. They’re either going to bring down your capability, they’re going to steal stuff from you, or they’re going to plant malicious code in your systems and they’re going to come back at some point under their timetable and bring you down.”

As for the revision of those standards, it’s currently parked in the Office of Management and Budget awaiting approval, Ross said. Ideally, the Defense Department would begin to use those standards within the next 18 months to help determine whether to award a business a contract.

But will those standards solve the problem? Here’s how Ross described the challenge; an excerpt of our conversation is below.

FIFTH DOMAIN: I know the Department of Defense is working with NIST to update standards used by contractors to secure data. Will that document establish requirements and responsibilities that extend to the supply chain, considering those smaller companies are often more vulnerable?

ROSS: It doesn’t. The requirements are the requirements. But the problem you described is a real one. Information that’s critical doesn’t lose value when it goes from the federal government to a prime contractor, and that value stays just as high when it goes to the sub. I think the ultimate solution is you have to protect the information no matter where it is, and somebody is going to have to pay for that. There’s no free lunch. We are always talking about what’s the [return on investment] for doing all the security stuff. We never look in the rearview mirror and say what was the cost of the cleanup? And if you remember the OPM breach not that long ago, in 2015, that cleanup I believe cost over a half a billion dollars. The cleanup is an order of magnitude more expensive than it would have cost to protect the system in the first place.

FIFTH DOMAIN: You talk about the need to devote money to this, and yet we’ve had programs awarded recently by DoD where the bids were particularly low. These were for massive platforms. It raises the question of whether those trickledown cyber protections are even being considered at the front end.


ROSS: I think that there’s always a question about whether we have enough money or enough people to solve this problem. I’m going to come at this from a counter view. We developed a publication two and a half or three years ago. It’s NIST 800-160. That’s a system security engineering guideline. We took an international standard, a joint standard on systems engineering that had nothing to do with security [and used that as the basis to establish] everything you need to do in a life cycle process to make sure security is integrated into that system that you’re building.

The first couple of steps in the life cycle are called stakeholder requirements. That’s where you sit around the boardroom, or with the war fighters, and they’re saying, “what kind of a weapon system do we need to defeat the bad guy? Or what’s our business model in a Fortune 500 company?” Then you have to say, “Okay, we are totally dependent on technology to accomplish that mission. Knowing that, I’m going to build a system with a certain set of functional requirements.” Now we have a step that says you’re required to put your security requirements right in with those functional requirements, and there’s something called a trade space discussion that takes place with every system. That’s where the war fighters say, I want everything in the world, and then they say, well, you’ve got cost, schedule and performance. You can’t have that functional requirement because it costs too much. You can’t build the antigravity machine. Eventually you stabilize on a set of requirements that you build to.

That's where we're running off the rails now because largely those discussions don't take place in the life cycle development. It may turn out we have plenty of money.

FIFTH DOMAIN: For years, people have criticized FISMA as being a box checking exercise. Could the expanded focus on artificial intelligence help the state of cybersecurity?

ROSS: Good AI programs, they’re just programs. They’re algorithms, and those programs run on your system stack – applications, middleware, operating systems, firmware, down to the integrated circuits. So, if you’ve got a whizzbang application and you tell me it’s a trusted application, but it runs on an untrusted operating system, it’s game over. Any AI program that you’re running at the application level is going to give you totally bogus information. You can’t trust it if the adversary’s already taken control of your system with a rootkit.

Now, if you can build a trusted platform and take advantage of artificial intelligence, machine learning, you’ve got a great brave new world there. That’s awesome and we should be doing all of that. But you can’t hunt your way out of this problem because the attack surface is getting so large and complex and most of it’s unmanaged and most of it’s unprotected. And that’s a formula for going down in the long term.