Artificial intelligence isn’t sexy. I would know. In 2017, I led data creation and quality control for xView, one of the largest open-source overhead satellite imagery datasets in the world. I spent hours each day squinting at satellite imagery, looking over thousands of objects and trying to distinguish minute pixelated differences between a bulldozer and a tractor.

Data is the lifeblood of AI, and a crucial part of the infrastructure that will need to undergird these new technologies as the U.S. military adopts them. Right now, however, the defense community is focusing too much on how AI could fundamentally change warfighting and not enough on the less sexy — but much more important — infrastructural, organizational, and cultural changes that will need to be put in place before AI can have this effect.

The Department of Defense recognizes the importance of AI, from the secretary to the rank-and-file. Recognition is the first step in making AI militarily relevant, but hardly the last. AI is both a revolutionary and enabling technology capable of improving Defense Department missions from intelligence gathering to predictive maintenance, supply chain management, cybersecurity, and risk management. But as an enabling technology like electricity or the internal combustion engine, rather than a stand-alone weapon, AI must be integrated into the fabric of how the Department of Defense operates, rather than siloed into a few large Manhattan Project-like programs of record.

Thinking about use cases for artificial intelligence is important, but right now the focus should be on the less-than-sexy acquisition process, organizational structure, and digital infrastructure needed to incorporate AI into programs of record: the programs that validate, field, and sustain a capability over its lifecycle. The Defense Department knows it needs to innovate and knows how to do it, but as the 2018 National Defense Strategy notes, it must also “organize for innovation.”

Five Links on the Digital Value Chain

AI might seem sexy. The process of getting to an AI-centric military is not. To get there, the Defense Department needs to build up the necessary infrastructure, sometimes known as the “digital value chain.” The first link in the chain is large, labeled datasets used to teach machine learning models what to look for. Second, the department needs a cloud environment for data storage and the required computational power for training algorithmic models to learn from that labeled data. The third link is a development and operations environment in which software developers work side by side with the operations team that manages the deployment of algorithms to continuously spin out small bits of code rather than large chunks all at once. Fourth, small cross-functional teams of AI experts, end users (warfighters), user experience/user interface designers, and systems integrators from the defense industrial base should work together to perform “sprints” — short periods where specific work is completed and reviewed before progressing to the next iteration, or sprint. A good example of this was Project Maven, which used computer vision models to detect and identify objects captured in full motion video from drones. The fifth link is a culture across the Department of Defense that prioritizes prototypes and fielded capabilities rather than just research in labs.

1) Labeled datasets

Most models today require massive amounts of labeled data (i.e., data a human has annotated with the “ground truth” that the machine “learns” to recognize, so that it can make correct predictions when it encounters similar, unlabeled data). An average model with more high-quality data can outperform a superior model without the same quality or quantity of data. In a War on the Rocks article last month, Connor McLemore and Hans Lauzen made the point that data labeling is costly and time-intensive. I concur — I know this firsthand.
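To make the “learning from labels” idea concrete, here is a toy sketch — not any model the department actually uses — of the simplest possible supervised learner, a nearest-centroid classifier. It averages the feature vectors of human-labeled examples per class, then assigns a new example to the class whose average it sits closest to. The feature values and class names below are invented for illustration.

```python
# Toy illustration of supervised learning from labeled data (not DoD code).
# A nearest-centroid classifier learns each class's average feature vector
# from human-labeled examples, then predicts the label whose centroid is
# closest to a new, unlabeled example.
from collections import defaultdict
import math

def train(labeled_examples):
    """labeled_examples: list of (feature_vector, label) pairs."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for features, label in labeled_examples:
        if sums[label] is None:
            sums[label] = [0.0] * len(features)
        for i, x in enumerate(features):
            sums[label][i] += x
        counts[label] += 1
    # The "model" is just one centroid (mean vector) per class.
    return {label: [s / counts[label] for s in vec] for label, vec in sums.items()}

def predict(centroids, features):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Hypothetical "ground truth" produced by human labelers: two classes of
# image-derived feature pairs.
data = [([1.0, 1.0], "bulldozer"), ([1.2, 0.9], "bulldozer"),
        ([5.0, 5.0], "tractor"), ([4.8, 5.2], "tractor")]
model = train(data)
print(predict(model, [1.1, 1.0]))  # → bulldozer
```

The point of the sketch is the dependency it exposes: the model contains nothing that humans did not first encode as labels, which is why label quality and quantity dominate model performance.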

However, instead of viewing data labeling as a challenge, planners should reframe it as an opportunity. The Defense Department already has immense amounts of data, and it invests hundreds of thousands of dollars and many years to train what are, in effect, data labelers: enlisted technicians with systems expertise and domain knowledge. Navy sonar technicians already scour incoming acoustic signatures to gauge the difference between a Russian submarine and a whale. Air Force imagery analysts differentiate civilian vehicles from armored personnel carriers in Eastern Europe. This is a form of data labeling. The problem is not that the department cannot label data; it is that there are few, if any, interfaces designed to capture this data labeling in a way that can be used to train artificial intelligence.
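What such a capture interface might record is straightforward. The sketch below is hypothetical — the field names, identifiers, and values are illustrative, not any fielded system — but it shows the core idea: serialize each judgment an analyst already makes into a machine-readable label that can later flow into a training dataset.

```python
# Hypothetical sketch of a label-capture record: the analyst's routine call
# ("that contact is a submarine") is logged as training data as a side effect
# of work already being done. All field names and values are illustrative.
from dataclasses import dataclass, asdict
import json

@dataclass
class LabelRecord:
    source: str        # e.g., "sonar" or "overhead_imagery"
    sample_id: str     # pointer to the raw signal or image chip
    label: str         # the analyst's ground-truth call
    analyst_id: str    # who made the call, for quality control
    confidence: float  # analyst-reported confidence, 0.0 to 1.0

def capture(record: LabelRecord) -> str:
    """Serialize one judgment so it can be pooled into a labeled dataset."""
    return json.dumps(asdict(record))

entry = capture(LabelRecord("sonar", "contact-0042", "submarine", "STG2-117", 0.9))
```

Aggregated over thousands of analysts, records like this would amount to exactly the labeled dataset the first link in the value chain demands — produced at near-zero marginal cost.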

2) Cloud environment for storage and model training

All the labeled data in the world is useless without somewhere to keep it where it is both accessible and safe. There are nearly 600 AI projects across the department, but few of these projects have become programs of record because of stovepiping and a focus on early-stage research rather than prototyping. AI work in government labs is critical and complements the rapid progress being made in the commercial sector. The difference is that commercial companies operationalize and deploy their research. Eric Schmidt, former chairman of Google and Alphabet, told Congress in April, “Any military that fails to pursue enterprise-wide cloud computing isn’t serious about winning future conflicts.”

The Pentagon might follow the example of the intelligence community in pursuing a cloud solution to allow its various agencies to share and use outside data streams and build upon one another’s research — rather than each organization repeating basic research and development. This solution should be acquired from a reputable AI or technology company rather than from a traditional defense contractor. In FY 2017, a significant amount of the $7.4 billion that the Defense Department spent on AI, big data, and cloud computing went to contractors, with minimal resulting AI infrastructure or programs of record. For perspective, that’s more than artificial intelligence leaders Google, Amazon, Apple, Intel, and Microsoft have spent combined to acquire AI startups since 2006.

3) Agile practices and a software development environment

Training artificial intelligence models requires a place to store data and access to computing power to teach the model using that data. The cloud computing environment provides both. But just as importantly, it enables the agile development process and culture needed to create and deploy user-centered products quickly. Without such a process, the department will be unable to find and fix flaws throughout the development of a program or adapt to the changing needs of its users. The department’s Joint Improvised-Threat Defeat Organization and the Air Force’s Project Kessel Run have both successfully implemented a cloud environment that supports rapid, continuous delivery of software through active, ongoing collaboration between software developers and operations staff. This culture of agile software development, deployment, and maintenance should be replicated in other defense organizations and programs of record to allow more teams to use AI to address their problems.

It’s no secret that the department has trouble recruiting top technical talent. However, this isn’t only because talented engineers are repelled by low salaries. The U.S. Digital Service has shown that top talent is attracted to the hardest problems. The Defense Department similarly has hard problems that would challenge world-class experts, and it has hiring mechanisms to bring them on for short-term roles. The problem is that right now it cannot provide the requisite tools (cloud, labeled data, agile development and operations methodologies, and machine learning infrastructure) these engineers are accustomed to using.

4) Integration teams or ‘Maven-like’ efforts to integrate, iterate, and scale

The Defense Department faces less of a challenge in generating new advances in artificial intelligence than in creating a culture in which existing advances from the private sector can be integrated into military programs and operations. Project Maven has received a lot of media attention as one of the U.S. military’s primary efforts to operationalize AI. A small Defense Department team of civilians and activated reservists partnered with industry experts to deploy computer vision models to operating forces. The military needs more of these teams: industry engineers, academic experts, users, acquisition experts, and systems integrators working together to develop solutions, test and evaluate them at speed, iterate, redeploy, and eventually integrate into programs of record to be sustained in the long term.

However, the question remains whether this project, and others like it, can be scaled successfully into programs of record, especially after the project loses the watchful eye and bureaucracy-cutting power wielded by previous Deputy Secretary of Defense Bob Work and his successor, Patrick Shanahan. The Department of Defense won’t be able to integrate AI capabilities into dozens of programs of record at once if each integration requires the direct oversight and support of senior leaders. The technical solutions exist, but the will and buy-in to make them flourish do not.

5) A culture

The most difficult shift will be changing the way the department thinks: from large, long-term, siloed “waterfall” projects to smaller, quicker, collaborative agile methodologies, from the years-long contracting of the Federal Acquisition Regulation to the rapid contracting of Other Transaction Authorities, and from permitting innovation at the margins to adopting innovation at scale. This goes beyond artificial intelligence. The Defense Department has a culture of stewardship rather than a culture of audacity. We tell ourselves we are the greatest military in the world today instead of changing to make sure we are the greatest military tomorrow.

A whole-of-military effort to embrace AI requires new thinking, which in turn requires educating military personnel to a level of basic AI literacy, whether it is technical, tactical, or strategic. Warfighters need to learn to look for solutions to problems through an AI lens, program managers must learn to move quickly and accept risk, and research labs must be willing to accelerate their basic research and use mechanisms that can operationalize their work. Finally, the defense industrial base must learn to see agile companies that provide novel services as an opportunity rather than a threat.

Conclusion

The Defense Department is in an unenviable position. It’s an organization that isn’t prepared for change but direly needs it. But change isn’t a flashy headline or a new innovation arm; it’s a new structure for operating — technical and cultural. Whether the department wants to use artificial intelligence for a few narrow mission sets in logistics and intelligence gathering, or one day seamlessly integrate it — like the combustion engine, electricity, or computers — into the background of all military missions, it must first build a foundation. That means not only a technical backbone of data, storage, and computing power to harness AI, but also a cultural shift away from the slow, antiquated way of doing acquisitions.

The hard-to-swallow truth of the AI race is this: If the United States loses, it will be of its own design. America’s technology industry is the most innovative in the world and has already tackled many of the AI use cases that the military seeks to adopt. The technical solutions exist, but a willingness to adopt them does not. America needs to not just research, but also prototype and integrate, the most advanced technologies if it is to move beyond victories in newspaper headlines and move toward victories on future battlefields.

Lieutenant Junior Grade Richard Kuzma is a graduate of the Harvard Kennedy School, where he was a Belfer Center for Science and International Affairs Student Fellow. He wrote his thesis on how DoD should structurally change to implement artificial intelligence. He works at Defense Innovation Unit Experimental (DIUx), where he led data creation on xView, one of the largest overhead imagery datasets in the world. He is curating a machine learning self-study program for members of DoD as a Defense Entrepreneurs Forum Firestarter Fellow. The views expressed here are his own and do not reflect those of DoD, the Navy, or DIUx.

Image: DARPA