Stanford University researchers funded by DARPA’s Neuro Function, Activity, Structure, and Technology (Neuro-FAST) program have developed new optical imaging and analysis techniques that allowed them to decode the neural activity of awake mice engaged in an adaptive, decision-making task. The findings of the Stanford team, made in collaboration with researchers at the California Institute of Technology and detailed this week in the journal Neuron, give researchers new insight into how the mammalian brain coordinates neural activity to complete voluntary behaviors. The team’s overall results advance DARPA’s goal of building a knowledge base and toolkit with which the neurotechnology community can accelerate understanding of brain structure and function.

“DARPA created the Neuro-FAST program to find new ways to see the brain, and the optical technologies we’ve developed now allow researchers to observe the brain in detail as it processes behavior,” said Justin Sanchez, the DARPA program manager. “This achievement helps us to identify the roles played by individual neurons in coordinating and carrying out behaviors, and I think it could be a cornerstone of future neural interface technologies.”

In this set of experiments, transgenic mice learned to lick for water in the presence of a particular odor or refrain from licking in the presence of a second odor to avoid a puff of air. This behavior required the animals to integrate immediate stimuli in the form of an odor, their own internal state, and past experience to rapidly and flexibly select and perform specific actions that helped them achieve the particular goal of receiving water.

The researchers, led by Dr. Karl Deisseroth, Will Allen, and Isaac Kauvar, combined whole-cortex widefield calcium imaging—a method in which genetically modified neurons fluoresce when activated—with tiled two-photon microscopy that provided more detailed views of particular areas of the cortex. This pairing allowed them to record across almost the entire neocortex of the mice’s brains, yet with enough resolution to reveal the spatial and temporal dynamics of individual neural cell types and neural circuits. Previous single-cell imaging of this type—and even intracortical recordings using electrodes—had been limited to only a few areas of the brain at a time in mammals, missing the insights that could be garnered from a global view.

The experiments revealed that neural circuits and different cell types distributed across much of the neocortex play a part in even relatively simple adaptive behaviors. But they also confirmed the hypothesis that the prefrontal cortex plays a key role in coordinating the planning and execution of such behaviors, effectively broadcasting the animal’s behavioral state to local populations of neurons throughout the brain.

That finding created a headache for the researchers, however. Many cells throughout the cortex seemed to code task engagement similarly, but they did so without any apparent specific correlation to behavioral output—that is, the choice made by the mice. To begin to unravel the mechanisms behind this “global brain state,” the Neuro-FAST team adjusted their experiments and developed new equipment to allow for imaging and manipulation of specific types and subsets of neurons across the entire cortex. First, using different colors of light, they imaged the activity of only certain types of neurons while the mice engaged in the task. Then, using optogenetic and pharmacological interventions, they systematically inhibited the activity of specific cell types and regions of the brain over a series of trials.

By layering the high-resolution results of these trials, the researchers were eventually able to decode the spatial and temporal patterns that resulted in the mice’s decision to lick. They found that throughout the cortex, activity in single cells, cell types, and neuropil—the dense network of connections that makes up much of the brain’s gray matter—signaled distinct aspects of task-related information. This activity was partially separate from the animals’ movements, implying that it represented task information rather than simple sensorimotor feedback, which tends to be confined to specific regions of the brain.
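The decoding analysis described above—predicting a binary behavioral choice (lick vs. no lick) from patterns of neural activity—can be illustrated with a minimal sketch. This is not the Stanford team’s actual pipeline; the data below are synthetic, and the use of logistic regression is an assumption chosen only to show the general idea of training a classifier on per-trial neural activity and testing it on held-out trials.

```python
# Illustrative sketch only (synthetic data, hypothetical decoder choice):
# decode a binary lick/no-lick choice from simulated per-trial neural activity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_neurons = 200, 50
# 1 = lick, 0 = no lick, assigned at random for this toy example.
choices = rng.integers(0, 2, size=n_trials)

# Simulated activity: a weak choice-related signal per neuron, plus noise.
signal_weights = rng.normal(0.5, 0.1, n_neurons)
activity = np.outer(choices, signal_weights) + rng.normal(
    0.0, 0.5, size=(n_trials, n_neurons))

# Train on some trials, evaluate decoding accuracy on held-out trials.
X_train, X_test, y_train, y_test = train_test_split(
    activity, choices, test_size=0.25, random_state=0)

decoder = LogisticRegression().fit(X_train, y_train)
accuracy = decoder.score(X_test, y_test)
print(f"Held-out decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on held-out trials is what would indicate that the recorded population carries choice-related information; real analyses would additionally control for movement-related signals, as the researchers did here.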

“There’s a very long way to go, but what this research begins to teach us is how the brain processes and executes goal-directed behaviors in higher-order species,” Sanchez said. “DARPA has a long-term goal of developing systems to help Service members better manage the sophisticated systems and complex scenarios that increasingly define modern military service. Neuro-FAST aims to unlock one piece of that puzzle by giving us the knowledge and tools to begin making sense of neural circuitry and coding.”

Future research will attempt to advance understanding of how widespread task-related information is coordinated throughout the cortex. This research could involve comparing cortical activity across a variety of behaviors involving different actions, expectations, and sensory modalities to determine how cortical activity is remapped for different goals.

Image Caption: This three-part image shows (left to right): 1) Surgical preparation to record single-cell activity from across cortex; 2) Widefield fluorescent image of a mouse brain taken through a 7-millimeter window used to expose the dorsal cortex for two-photon imaging. The different colored dots represent different fields of view acquired sequentially in different sessions of approximately 30 trials per session; 3) The insets show the maximum projection of fluorescence from single fields of view in layer 2/3 of the brain, acquired with two-photon microscopy.

# # #

Media with inquiries should contact DARPA Public Affairs at outreach@darpa.mil

Associated images posted on www.darpa.mil and video posted at www.youtube.com/darpatv may be reused according to the terms of the DARPA User Agreement, available here: http://go.usa.gov/cuTXR.

Tweet @darpa