The first phase of Project Maven, which incorporates multiple teams from across the Defense Department, is an effort to automate the identification and classification of images taken by drones — cars, buildings, people — providing analysts with increased ability to make informed decisions on the battlefield.

The race to adopt cutting-edge AI technology was announced in April 2017 by then-Deputy Defense Secretary Robert Work, who unveiled an ambitious plan called the Algorithmic Warfare Cross-Functional Team, code-named Project Maven. The initiative, Work wrote in an agency-wide memo, is designed to “accelerate DoD’s integration of big data and machine learning” and “turn the enormous volume of data available to DoD into actionable intelligence and insights at speed.”

The team, The Intercept has learned, is working to develop deep learning technology to help drone analysts interpret the vast image data vacuumed up from the military’s fleet of 1,100 drones to better target bombing strikes against the Islamic State.

Google, which has made strides in applying its proprietary deep learning tools to improve language translation and image recognition, has formed a cross-team collaboration within the company to work on the AI drone project.

The contract, first reported Tuesday by Gizmodo, is part of a rapid push by the Pentagon to deploy state-of-the-art artificial intelligence technology to improve combat performance.

The military contract with Google is routed through a Northern Virginia technology staffing company called ECS Federal, obscuring the relationship from the public.

Google has quietly secured a contract to work on the Defense Department’s new algorithmic warfare initiative, providing assistance with a pilot project to apply its artificial intelligence solutions to drone targeting.

“The technology flags images for human review, and is for non-offensive uses only,” a Google spokesperson told Bloomberg. “Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”

The idea is essentially to provide a recommendation tool: the AI program quickly singles out points of interest around a given type of event or target, allowing drone analysts to work more efficiently.

The department announced last year that, just over six months after its launch, the AI initiative was being used by intelligence analysts for drone strikes against ISIS at an undisclosed location in the Middle East.

Gregory C. Allen, an adjunct fellow with the Center for a New American Security, says the initiative has a number of unusual characteristics, from its rapid development to its level of integration with contractors.

“The developers had access to the end-users very early on in the process. They recognized that [with] AI systems … you had to understand what your end-user was going to do with them,” Allen said. “The military has an awful lot of experts in analyzing drone imagery: ‘These are the parts of my job I hate, here’s what I’d like to automate.’ There was this iterative development process that was very familiar in the commercial software world, but unfamiliar in the defense world.”

“They were proud of how fast the development went, they were proud of the quality they were getting,” added Allen, co-author of “Artificial Intelligence and National Security,” a report produced on behalf of the U.S. Intelligence Advanced Research Projects Activity.

While the contract with Google has gone unreported until today, Project Maven leaders have not been shy about their push to partner with Silicon Valley and harness the growing reach of commercial AI technology.

Not long after the formation of the Defense Innovation Board, which was created in 2016 to encourage military adoption of breakthrough technology, the board released a set of recommendations that stressed the importance of adopting artificial intelligence and machine learning, declaring that technological superiority with AI is as important as “nuclear weapons in the 1940s and with precision-guided weapons and stealth technology afterward.”

The DIB — which is chaired by Eric Schmidt, former executive chair of Alphabet, Google’s parent company — recommended “an exchange program and collaboration with industry and academic experts in the field.”

Lt. Gen. John N.T. “Jack” Shanahan, director for defense intelligence overseeing Project Maven, joked at the GEOINT 2017 conference that he hoped Google would start sharing more of what it knows with the Pentagon. “On the far end of the scale, you see Google. They don’t tell us what they have, unless anyone from Google wants to whisper in my ear later,” he said.