Every day, the world unfolds much like a movie does. Yet no two people experience the world in quite the same way. Visually, it's as if everyone were watching slightly different versions of the same movie.
Researchers at the University of California, Berkeley have just taken a major step toward decoding and reproducing what people see, using brain scans taken while three of the researchers watched movie clips. And while the technique currently works only for clips from actual movies, it may one day be able to show what people are seeing and thinking in their everyday mental movies, a kind of mind reading.
Reconstructing movie clips is just a taste of things to come. It brings us one step closer to the day when we can read the brain and tell what our loved ones, and not-so-loved ones, are seeing and thinking, for better or for worse.
In the past, researchers were only able to decode people's visual images of stationary objects, like a chair, by using functional magnetic resonance imaging (fMRI). These brain scans record the amount of blood oxygen in specific areas of the brain, which increases as an area becomes more active. But these changes are relatively slow compared to changes in the activity of the nerves themselves. Decoding and reproducing moving images therefore also requires information about the fast changes occurring within the nerve cells.
Because the study involved sitting for hours inside a scanner and watching movie clips, three members of the research team served as subjects.
The subjects each watched two hours of movie clips while undergoing an fMRI scan. The scan data was fed into a computer program that learned, second by second, the correlation between the fMRI scan signals and the image that a subject was watching at the time.
The subjects then watched a second, different set of movie clips, nine minutes in total, with each clip repeated ten times. Using what it had learned from the first set of scans, the computer program was able to decode, with good accuracy, which clips the subjects were watching from the new brain scan data.
To decode, the computer program drew on 5,000 hours (one million clips) of random movie clips taken from the Internet. It then averaged the images from the 100 clips that best matched the subject's scan data. The result was a single blurry but recognizable reconstruction of what the subject had actually been watching, a successful decoding.
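The pipeline described above, learning a mapping from what is on screen to brain activity and then decoding a new scan by ranking a large library of candidate clips and averaging the best matches, can be sketched in miniature. Everything in this toy is a hypothetical stand-in: random feature vectors play the role of movie frames, a plain least-squares fit replaces the study's actual encoding model, and the sizes are made up for speed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical; the real study used hours of fMRI data
# and a library of one million candidate clips).
n_voxels, n_features = 50, 20

# 1) Training: learn a linear map from clip features to voxel responses
#    by least squares, standing in for the study's encoding model.
true_map = rng.normal(size=(n_features, n_voxels))          # unknown "brain"
train_feats = rng.normal(size=(500, n_features))            # training clips
train_scans = train_feats @ true_map + 0.1 * rng.normal(size=(500, n_voxels))
weights, *_ = np.linalg.lstsq(train_feats, train_scans, rcond=None)

# 2) Library: predict the scan each random candidate clip would produce.
library = rng.normal(size=(10_000, n_features))
predicted_scans = library @ weights

# 3) Decoding: given an observed scan, rank candidates by how well their
#    predicted scans correlate with it, then average the top 100.
watched = library[42]                                       # clip actually watched
observed = watched @ true_map + 0.1 * rng.normal(size=n_voxels)

def corr_with(scan, candidates):
    """Pearson correlation between one scan and each candidate's prediction."""
    s = scan - scan.mean()
    c = candidates - candidates.mean(axis=1, keepdims=True)
    return (c @ s) / (np.linalg.norm(s) * np.linalg.norm(c, axis=1))

scores = corr_with(observed, predicted_scans)
top100 = np.argsort(scores)[-100:]
reconstruction = library[top100].mean(axis=0)  # the blurry averaged "image"
```

Averaging the top matches is what makes the reconstruction blurry but recognizable: no single library clip is the right one, but their shared structure survives the average.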
The ability to reproduce what a person is seeing has many possible applications, from communicating with people unable to express themselves to changing the entire dating experience and beyond.
An article detailing the study was published online by Current Biology on September 22, 2011 and will also appear in a future print edition of the journal.