“Epic” was the word Regina Dugan used to describe her team’s research and development projects, which included enlisting “Fast and Furious” movie franchise director Justin Lin to help create the next-generation movie experience. Dugan, vice president of Google’s Advanced Technology and Projects (ATAP) group, delighted thousands of developers at Google’s annual I/O conference in May as she orchestrated demonstrations of applied technologies that seemed to originate from just over the horizon of most people’s imaginations.

The 36 pages of results returned when you Google her name confirm Dugan’s fame, making kudos redundant, but reporting where she came from is important because it explains ATAP’s role within Google.

Dugan’s last job was head of the Defense Advanced Research Projects Agency (DARPA), a 50-year-old Department of Defense research agency with a budget of about $3 billion that’s charged with preventing strategic surprises from – and creating strategic surprises for – America’s adversaries. DARPA earned a reputation for producing high-impact results quickly. A few of DARPA’s innovations include the internet, the Global Positioning System (GPS), drones and micro-electro-mechanical systems (MEMS).

DARPA isn’t a monolithic government agency. It doesn’t have its own labs or a large R&D staff. Instead, the agency recruits teams of highly accomplished technical leaders – usually PhDs and experts in their fields – to work on short, three-to-five-year projects with university and industry partners. The goals are ambitious, such as building a hypersonic test vehicle to fly at Mach 20 (15,200 mph). DARPA doesn’t build products – it proves or disproves the feasibility of building a solution that could be developed into a product. This is an important distinction that explains DARPA’s role: Success in building a prototype gives tangible evidence that a strategic end-product could be built.

Dugan joined Google to create the ATAP group, where she could apply the DARPA model to speed Google’s strategic research projects. She traded the Department of Defense’s deep pockets for Google’s, and exchanged fighting America’s adversaries for fighting perplexing product development challenges.

[Related: 9 most important announcements at Google I/O 2015]

Dugan’s first demonstrations addressed the user interface (UI) problems unique to smartwatches and other small wearables: their tiny – or in some cases nonexistent – screens essentially call for new UIs.

ATAP applied trusted radar technologies to create a touchless UI that controls a device by interpreting fine finger movements and hand gestures made in the air. The prototype shown at I/O 2015 captured the radar reflection of a movement and applied machine learning for accurate recognition. Any movement interpreted by the radar UI can be a metaphor for any input, such as up, down, make a call or take a photo. One such metaphor was demonstrated with a thumb-and-forefinger twirling motion in the air that reset the time on a digital clock display. The concept is explained in a few words and a roughly minute-and-a-half video:
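The pipeline described above – reduce a radar reflection to features, recognize the gesture with a learned model, then treat the gesture as a metaphor for an input – can be sketched in a few lines of Python. This is an illustration only: the feature values, gesture names and actions below are invented, and the real system is far more sophisticated than the nearest-template matching shown here.

```python
import math

# Hypothetical sketch of the radar gesture pipeline described above.
# A radar frame is assumed to be reduced to a small feature vector;
# recognition is a nearest-template match (a simple stand-in for the
# machine learning the article mentions); the recognized gesture is
# then mapped to whatever input it is a metaphor for.

TEMPLATES = {               # "learned" feature centroids, one per gesture
    "twirl": [0.9, 0.1, 0.4],
    "swipe": [0.2, 0.8, 0.6],
}

ACTIONS = {                 # gesture -> input metaphor
    "twirl": "set_time",    # the digital-clock demo
    "swipe": "scroll",
}

def classify(features):
    """Return the gesture whose template is nearest the feature vector."""
    return min(TEMPLATES, key=lambda g: math.dist(TEMPLATES[g], features))

def dispatch(features):
    """Map a radar-derived feature vector to the action bound to its gesture."""
    return ACTIONS[classify(features)]
```

Because the gesture-to-action mapping is just a table, any recognized movement can be rebound to any input – which is the point of calling gestures "metaphors."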

Cut from whole cloth

A second demonstration of a wearable product – code-named Project Jacquard – showed how to weave a multi-touch input panel, like a mouse pad, into regular cloth using the textile industry’s existing processes. Now, multi-touch has been around since IBM first experimented with it in the 1960s, and it’s now used in every touchpad and touch screen. No need for ATAP’s high-powered R&D for that. Redesigning multi-touch so that it could be produced by the textile industry at scale, however, is right in ATAP’s wheelhouse.
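Functionally, a woven panel works like an ordinary touchpad: conductive yarns crossing in the warp and weft form a grid, and a touch changes the reading at the crossings under the finger. A minimal sketch of scanning such a grid follows; the threshold and readings are invented for illustration and this is not Project Jacquard’s actual firmware.

```python
# Illustrative sketch only -- not Project Jacquard's actual firmware.
# Conductive yarns woven into the cloth form a row/column grid; a touch
# raises the reading at the crossings under the finger. Scanning the
# grid for readings above a threshold yields multi-touch coordinates,
# much as a conventional touchpad controller does.

THRESHOLD = 0.5  # assumed normalized "touch detected" level

def scan(readings):
    """Return (row, col) for every yarn crossing reading above THRESHOLD.

    `readings` is a 2-D list indexed [row][col], one value per crossing.
    """
    return [(r, c)
            for r, row in enumerate(readings)
            for c, val in enumerate(row)
            if val > THRESHOLD]
```

Two fingers on a 3x3 panel would show up as two crossings: `scan([[0.1, 0.9, 0.1], [0.1, 0.1, 0.1], [0.8, 0.1, 0.1]])` returns `[(0, 1), (2, 0)]`.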

ATAP’s Ivan Poupyrev described the route that began with hand-weaving conductive yarns into cloth to make a prototype multi-touch panel. He stepped through the collaboration with textile industry partners to redesign the hand-made prototype so that it could be made in textile production plants using unmodified (legacy) spinning and weaving equipment.

The path to prove the feasibility of Project Jacquard reached its destination when a multi-touch panel was woven into cloth at a textile factory; the cloth was then shipped to a Savile Row tailor in London and sewn into a jacket. As conclusive proof, a telephone call was placed on a smartphone with a swipe of the jacket’s sleeve.

Poupyrev made an important distinction: Project Jacquard demonstrated feasibility, not specific applications. Those, he hoped, would be engineered by software developers and tailored into fashion by designers creating new applications for soft e-textile computing. Poupyrev proved that if Google wanted to, it could hand ATAP’s design over to be manufactured at scale, much as the U.S. Department of Defense can take a DARPA design into production.

How a woven multi-touch panel works is explained in this 94-second video: