The Ninja Brain: Humans Can Prioritise Meaningful Sounds Even While Asleep

By Matthew Warren

We often think of sleep as a chance to switch off from the outside world, leaving us blissfully ignorant of anything going on around us. But neuroscience research has shown this is a fantasy – we still monitor the environment and respond to particular sounds while we’re sleeping (at least in some stages of sleep) – a fact that will be unsurprising to anyone who has woken up after hearing someone say their name.

Now a study published in Nature Human Behaviour has revealed more about the brain’s surprisingly sophisticated levels of engagement with the outside world during sleep. Not only does the sleeping brain respond to certain words or sounds – it can even select between competing signals, prioritising the one that is more informative.

For obvious reasons it’s a challenge for researchers to figure out what people are paying attention to while they’re asleep. Guillaume Legendre at the École Normale Supérieure in Paris and his colleagues overcame this problem by looking at the changing patterns of their volunteers’ brainwaves using EEG (electroencephalography, which uses scalp electrodes to record the brain’s electrical activity).

The team recruited 24 French participants to complete a series of listening tasks while awake and asleep. In the initial part of the experiment, the team recorded the participants’ EEG signal while they listened to one-minute-long excerpts of speech. Some of these excerpts were taken from real news reports, stories, movies, and Wikipedia entries. Others were passages of so-called “Jabberwocky” text, which had normal sentence structure but gibberish content (as in Lewis Carroll’s nonsense poem of the same name: “’Twas brillig and the slithy toves / Did gyre and gimble in the wabe…”).

The team then trained a computer algorithm to learn how the participants’ brainwaves varied according to the audio they’d heard. After this training, the researchers could present the algorithm with a participant’s brainwave recording and it could reconstruct the audio signal that they’d been listening to at the time.
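To give a flavour of how this kind of decoding is typically done, here is a minimal sketch of a “stimulus reconstruction” decoder: a regularised linear model that maps time-lagged EEG channels onto the speech envelope, then scores held-out data by correlating the reconstructed and true envelopes. The data, sampling rate, lags and regularisation below are illustrative assumptions, not the authors’ actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

def lagged_features(eeg, max_lag):
    """Stack time-lagged copies of each EEG channel (samples x channels)."""
    n_samples, n_channels = eeg.shape
    X = np.zeros((n_samples, n_channels * (max_lag + 1)))
    for lag in range(max_lag + 1):
        X[lag:, lag * n_channels:(lag + 1) * n_channels] = eeg[:n_samples - lag]
    return X

# Hypothetical data: 60 s of 64-channel EEG at 100 Hz plus the speech envelope.
fs = 100
rng = np.random.default_rng(0)
eeg = rng.standard_normal((60 * fs, 64))
envelope = rng.standard_normal(60 * fs)

# Fit the decoder on the first 40 s, using EEG lags up to 250 ms.
X = lagged_features(eeg, max_lag=int(0.25 * fs))
decoder = Ridge(alpha=1.0).fit(X[:40 * fs], envelope[:40 * fs])

# Reconstruct the envelope for the held-out 20 s and score it with Pearson r.
reconstructed = decoder.predict(X[40 * fs:])
score = np.corrcoef(reconstructed, envelope[40 * fs:])[0, 1]
print(f"reconstruction accuracy (Pearson r): {score:.3f}")
```

With random data like this the score hovers around zero; with real EEG recorded while someone listens to speech, the reconstruction tracks the heard audio well above chance, which is what lets researchers ask which of two competing streams the brain is following.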

Next, to see how the brain responded to two competing inputs, the researchers analysed the EEG brainwave data when participants heard both the real and nonsense texts simultaneously, one playing in each ear (this started while they were awake and focused on the meaningful speech, and continued after they fell asleep).

The researchers’ computer algorithm was able to reconstruct both of the audio signals from the participants’ brainwaves, suggesting that they were processing both forms of speech even while asleep. And crucially, during both wakefulness and sleep, the reconstruction was better for the signal from the meaningful text than the nonsense text, suggesting that the brain had been “amplifying” the meaningful story in some way so that it left clearer traces in the EEG data.
When participants were awake, the meaningful text was reconstructed better on 60.6 per cent of trials; during sleep this fell to 52.4 per cent. But even this smaller amplification effect was significantly above the 50 per cent expected by chance if the sleeping brain had no longer been able to prioritise the meaningful speech.
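To make that chance comparison concrete, here is a small illustrative calculation: a binomial test of the sleep-trial proportion against 50 per cent. The trial count is a made-up placeholder (the figures above are only percentages), so this is a sketch of the logic rather than the study’s actual statistics.

```python
from scipy.stats import binomtest

# Hypothetical trial count; only the 52.4 per cent figure comes from the study.
n_sleep_trials = 500
n_better = round(0.524 * n_sleep_trials)  # trials where the meaningful text won

# Two-sided binomial test against the 50 per cent expected if the sleeping
# brain could no longer prioritise the meaningful speech.
result = binomtest(n_better, n_sleep_trials, p=0.5)
print(f"{n_better}/{n_sleep_trials} trials better, p = {result.pvalue:.3f}")
```

The larger the number of trials, the easier it is for a small bias like 52.4 per cent to reach statistical significance, which is why a modest effect can still be reliably above chance.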

Several factors appeared to influence whether or not the EEG signal for meaningful stories was amplified during sleep. Depth of sleep was important: meaningful stories were amplified during light sleep (stage 2) but not deep sleep (stage 3). And the effect appeared to be only temporary, occurring during the first 30 seconds of a story but not the last 30 seconds.

Overall, the researchers write that their experiment provides evidence that people can selectively process meaningful or important information during sleep – a ninja-like ability which could have evolutionary advantages. “In an ever-changing environment, the ability to process relevant signals during light sleep offers substantial benefits,” they write. “In particular, it would allow signalling the presence of events, necessitating a rapid reversal towards wakefulness.”
That said, the bias towards processing the meaningful stories during sleep appeared very small, so it remains unclear how important it would be in a real-world environment. And because participants had been told to pay attention to the meaningful story, it may be that the amplification effect was driven “top-down” by their attention rather than by the meaningfulness of the stimulus per se. Nevertheless, the study makes it clear that the brain’s engagement with the outside world during sleep is far more complex than we often recognise.
