As a US military physician, I have always understood the strange interplay of conflict and science. In the 20th century, our first steps on the Moon would not have been possible without the German V2 rocket, developed under the Nazi regime. Our understanding of the universe comes from our ability to escape the atmosphere, and that ability comes from our history of war.

In the 21st century, conflict has again driven scientific progress, but instead of gazing upwards and unlocking the mysteries of the sky, we have instead looked inwards, at the brain.

My career in neurotechnology has been shaped by the early conflicts of the millennium; the field grew from methods of treating and repairing soldiers injured in battle. If primates could move cursors across a screen through the power of thought, we wanted to see whether an advanced prosthetic arm could be controlled the same way. In under a decade, we pushed the science far beyond what was thought possible in 2004. Today, we have not only demonstrated control of an external robotic limb through signals from a human brain, but have also restored the sensation of touch to an individual with a severe neurological injury.

We are on the cusp of a truly transformative time for neuroscience. Reflecting on the past, let me make some predictions about the future.

All modern economies are derived from astronomy. This may sound strange at first, but think back: without astronomy, we could not determine our location on Earth. We would not have cartography, navigation or trade. We used the stars to make maps, and we used maps to chart the modern world.

The modern economy also owes a great deal to our limited understanding of the human mind. Advertising, marketing, focus groups – all rely on psychology, on a mechanistic understanding of how we think and make decisions. So too do negotiation, ethics and law.

But what we know about the mind is very different from what we know about the stars. The telescope was a tool that allowed for the continual observation of millions of stars. As technology has improved, we have expanded our field of view. To look inwards at the mind, we’ve had to rely on techniques that are much more limited. It’s as though we have been trying to count stars by listening for them, rather than observing them directly, and so our maps have been limited.

While working at DARPA (the Defense Advanced Research Projects Agency), I had the good fortune of working with Karl Deisseroth, a professor at Stanford. Karl developed a method called Clarity, which allows us to create amazing maps of the brain, neuron by neuron, at a level of detail we have never seen before.

In the last decade we’ve gone from indirect to direct observation. We can now count thousands of neurons in real time as they fire, and then generate maps of millions more. These will be our star charts, and we can borrow techniques from astronomy to navigate the properties of the billions of neurons that make up the human brain.
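As a purely illustrative sketch (not drawn from the article), the real-time "counting" of firing neurons often reduces to detecting threshold crossings in recorded voltage traces. The simulated signal, spike locations and threshold below are all invented for the example:

```python
import random

random.seed(0)

# Simulated extracellular voltage trace: Gaussian baseline noise
trace = [random.gauss(0.0, 1.0) for _ in range(1000)]

# Inject large deflections at hypothetical spike locations
for t in (100, 400, 750):
    trace[t] += 12.0

# Simple spike detection: count upward crossings of a fixed threshold
# chosen well above the noise floor for this sketch
threshold = 6.0
crossings = sum(
    1 for prev, cur in zip(trace, trace[1:])
    if prev <= threshold < cur
)
print(crossings)  # detects the three injected spikes
```

Real systems use adaptive thresholds and spike-sorting to separate neurons recorded on the same electrode, but the core observation step is this simple event detection, repeated across thousands of channels.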

As we make maps of the mind, and chart ourselves, I believe we will uncover fundamentally better ways to understand how people think and decide, and how we communicate with each other. Clarity, and similar technologies, will generate a revolution in how we know ourselves.

A second major shift in our understanding of the brain will come through the dissemination of technology. In 2004, when I began at DARPA, deep-brain stimulation had been in use in the United States for less than a decade. Today, over 100,000 people around the world are living with an implanted device that improves their control over severely impairing diseases. More than 300,000 hearing-impaired people around the world have a cochlear implant that allows them to experience sound, speech and music.

In 2004, one of the most advanced mobile phones you could buy was the BlackBerry 7210, the first model to include a colour display. But then Steve Jobs gathered a small team of Apple colleagues to start "Project Purple". This was the beginning of the iPhone. In neurotechnology, we are still waiting for our iPhone. Once we have it, I believe the world will change.

But it isn’t just about our brains. Going directly to an understanding of ourselves through our brains is an obvious starting point. Ever since Phineas Gage’s unfortunate encounter with a tamping iron, we have known that changes in our brains result in changes to our minds and to ourselves. What we’ve learned, and emphasized more recently, is that changes to ourselves can result in changes to our brains. The link between malnutrition and depression is well established, but the link between our brains and the species of bacteria in our gut microbiome is revelatory.


These efforts to characterize the body’s information highways will lead to new therapies, not simply through devices but through new drug targets. Ideally, they will also yield a better understanding of the integration of mind and body, and of how traditional and alternative practices (such as yoga or meditation) can lead to profound outcomes in cellular diseases like cancer.

The 20th century was the century of the sky and the satellite, of revolutions in micro-electronics and computer technology, in control theory and big data. We will build on those lessons as we continue into the 21st century, the century of biology.

The Summit on the Global Agenda 2015 takes place in Abu Dhabi from 25-27 October

Author: Dr Geoffrey Ling is the Founding Director of the Biological Technologies Office at DARPA and a member of the Senior Executive Service (SES). He served as an officer in the US Army Medical Corps for 27 years before retiring as a Colonel in 2012. He is a member of the World Economic Forum Global Agenda Council on Brain Research and of the Meta-Council on Emerging Technologies.

Image: A laser-etched lead crystal glass artwork by Katherine Dowson entitled Memory of a Brain Malformation is seen at an exhibition at the Wellcome Collection in London March 27, 2012. REUTERS/Chris Helgren