
Researchers map brain activity to read dreams

The study matched patterns in subjects’ neural activity with imagery in their dreams

By Contributing Writer

The science fiction trope of reading another person’s thoughts may be one step closer to reality, thanks to recent developments in dream-decoding research.

In a study published last Thursday in the journal Science, a joint team of researchers from Japan and Brown University used a machine-learning paradigm to decode the brain activity of subjects during early sleep stages and reliably determine the content of their dreams.

During the study, which lasted about three years, the neural activity of three subjects in early sleep was recorded using functional MRI, which localizes brain activity to particular brain areas by tracking blood flow. The researchers also used EEG, which measures electrical activity along the scalp, to track brain activity over time and determine each subject’s stage of sleep.

Though dreaming is often associated with REM sleep — a state in which the body is paralyzed but the brain is highly active, much as it is during waking moments — dream-like images known as hypnagogic hallucinations are often associated with early non-REM sleep.

The researchers awoke the subjects every six minutes and asked them to report the general content of their dreams.

“We had to get as many dream samples as possible,” said Masako Tamaki, a postdoctoral researcher in the Department of Cognitive, Linguistic and Psychological Sciences and the second author of the study.

Subjects’ responses were typically muddled due to their disoriented states, according to the study. The experimenters categorized the images broadly because they did not have the technology to examine specific details about the dreams.

While awake, subjects viewed images that fell into the same categories they had reported dreaming about, such as human faces or furniture. While they viewed the images, the researchers recorded their neural activity.

The team used the data to construct a computer algorithm for each individual that paired particular brain activation states with corresponding visual imagery, achieving an accuracy of 60 percent, according to the study.
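The logic of this per-subject approach — train a classifier on brain patterns recorded while the subject views labeled images, then apply it to patterns recorded near sleep onset — can be sketched as follows. This is a minimal illustration with synthetic data, not the study’s actual pipeline; the classifier type, voxel counts and category names below are all assumptions made for the example.

```python
# Hypothetical sketch of the decoding paradigm described above.
# All data are synthetic; the study's real preprocessing, classifier
# and categories are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 500
categories = ["face", "furniture", "scene"]  # example categories

# Synthetic "waking" fMRI patterns: each category has a distinct mean
# activation pattern, plus trial-to-trial noise.
labels = rng.integers(0, len(categories), size=n_trials)
means = rng.normal(0.0, 1.0, size=(len(categories), n_voxels))
patterns = means[labels] + rng.normal(0.0, 2.0, size=(n_trials, n_voxels))

# One decoder per subject, trained on waking perception data.
decoder = LogisticRegression(max_iter=1000)
acc = cross_val_score(decoder, patterns, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")

# "Reading a dream" then amounts to applying the trained decoder to a
# pattern recorded just before the subject was awakened.
decoder.fit(patterns, labels)
dream_pattern = means[0] + rng.normal(0.0, 2.0, size=n_voxels)
decoded = categories[decoder.predict([dream_pattern])[0]]
print("decoded category:", decoded)
```

The key design point is that the decoder never sees sleep data during training: the reported dream categories are used only to choose which waking images to show, on the premise that perceiving and dreaming of a category evoke overlapping neural patterns.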

“There’s something in common between what goes on in dreaming and what goes on in perception,” said Jack Gallant, an associate professor of psychology at the University of California, Berkeley, in an article about the study published in the Los Angeles Times last week.

Tamaki said she hopes this research can eventually be expanded so that researchers can create more generalized algorithms to decode the dreams of a large number of people. Such a decoding mechanism could potentially help reveal the unconscious thoughts of coma patients, she said.

“The strides that are being made in this type of science are remarkable,” said Mary Carskadon, professor of psychiatry and human behavior at the Alpert Medical School and director of sleep research at Bradley Hospital, who was not involved in the study. “I find this whole new world of computational analysis to be so interesting and to have potential … for doing some really good things,” she said.

But Carskadon added there could also be a “dark side” to this type of research, noting that companies like Google and Facebook use similar machine-learning methods to analyze data and predict human actions to their advantage.

This research is only a preliminary step toward understanding the function of the vivid dreaming that occurs during REM sleep, which remains an enigma in sleep science, Tamaki said.

Both Tamaki and Carskadon said studying dream content during REM sleep poses several challenges: the difficulty of obtaining a sufficiently large sample of REM trials, the unreliability of image content reports obtained from REM sleep and the difficulty of keeping subjects asleep inside an fMRI machine, which is about as loud as an airplane engine.
