Presenting smells during sleep can help with memory tasks (Image: Scott MacBride/Getty Images)

Talking in your sleep might be annoying, but listening may yet prove useful. Researchers have shown that sleeping brains not only recognise words, but can also categorise them and respond in a previously defined way. This could one day help us learn more efficiently.

Sleep appears to render most of us dead to the world, our senses temporarily suspended, but sleep researchers know this is a misleading impression.

For instance, a study published in 2012 showed that sleeping people can learn to associate specific sounds and smells. Other work has demonstrated that presenting sounds or smells during sleep boosts performance on memory tasks – providing the sensory cues were also present during the initial learning.


Cat or hat?

Now it seems the capabilities of sleeping brains stretch even further. A team led by Sid Kouider from the École Normale Supérieure in Paris trained 18 volunteers to classify spoken words as either animals or objects by pressing buttons with their right or left hand.

Brain activity was recorded using EEG, allowing the researchers to measure the telltale spikes in activity that indicate the volunteers were preparing to move one of their hands. Since each hand is controlled by the motor cortex on the opposite side of the brain, these brainwaves can be matched to the intended hand just by looking at which side of the motor cortex is active.
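The study's EEG analysis is far more involved, but the core inference – reading the intended hand from which hemisphere's motor cortex is preparing a movement – can be sketched in a few lines. This is an illustrative toy, not the authors' pipeline: the function name, the use of electrodes C3 and C4 (standard left- and right-hemisphere motor sites), and the simple mean-amplitude comparison are all assumptions made for clarity.

```python
import numpy as np

def infer_intended_hand(c3, c4):
    """Guess which hand a movement is being prepared for.

    c3, c4 : 1-D arrays of EEG voltage samples (microvolts) from
    electrodes over the left (C3) and right (C4) motor cortex,
    taken from a pre-movement time window.

    Movement preparation shows up as a negative-going "readiness
    potential" over the hemisphere OPPOSITE the intended hand.
    """
    left_hemisphere = np.mean(c3)   # more negative = more preparation
    right_hemisphere = np.mean(c4)
    # Left hemisphere controls the right hand, and vice versa
    return "right" if left_hemisphere < right_hemisphere else "left"
```

In practice this comparison would be made on filtered, artifact-rejected signals averaged over many trials, but the crossed wiring – left cortex, right hand – is what lets the researchers match brainwaves to the intended response without any button actually being pressed.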

Once the volunteers had repeated the task enough times for the process to become automatic, they were taken to a bed in a dark room. Here, they were instructed to continue the task as they drifted off to sleep.

Once the EEG recording confirmed they were asleep, the researchers presented the volunteers with a new set of words. The volunteers’ brains continued to respond in the same way – preparing to make the movement appropriate to each word’s category, even though they were no longer moving their hands. Fresh words were introduced to ensure that the volunteers were still analysing the words’ meanings rather than merely responding to learned associations.

Automatic for the people

“This opens the door to a lot of questions about how much linguistic processing happens during sleep,” says Ken Paller at Northwestern University in Evanston, Illinois, who is investigating whether it is possible to implant false memories during sleep. “That’s unexplored territory.”

Kouider suggests this unconscious processing is possible because the task can be automated in a way that bypasses the prefrontal cortex, a region known to be heavily suppressed during sleep. “When you sleep, some brain regions sleep, while others remain totally awake,” he says. “Sleep is much more local than previously believed.”

This hints at what the limitations of unconscious processing might be. The prefrontal cortex is critical for executive functions such as planning, problem-solving and task-switching. “When you have two tasks you have to switch between, I’m not sure you could do that [in your sleep],” says Kouider.

On waking, the volunteers weren’t able to recall any of the words they had processed while asleep, but Kouider’s group is now investigating whether the approach can be extended so that new information is retained. “If you have a learning procedure, if it’s automatised enough, and if it’s simple, you might be able to learn it even during sleep,” he says.

The team is also investigating more complex linguistic processing. “We’re now looking at whether you can process a full sentence while sleeping, and detect whether it’s meaningful or not,” he says. “Or whether you can even pull out information relevant to the sleeper from a mixture of voices.”

Journal reference: Current Biology, DOI: 10.1016/j.cub.2014.08.016