Just a few weeks ago, we posted about how brain patterns can reveal almost exactly what you're thinking. Now, researchers at UC Berkeley have figured out how to extract what you're picturing inside your head, and they can play it back on video.

The way this works is very similar to the mind-reading technique we covered earlier this month. A functional MRI (fMRI) machine watches the patterns that appear in people's brains as they watch a movie, and those patterns are then correlated with the images on the screen. From these data, the researchers built a computer model to predict the relationship between a given brain pattern and a given image, along with a huge database matching 18 million seconds' worth of random YouTube videos to the brain patterns they would likely evoke.

With this database in place, the Berkeley research group was able to feed brain scans into their computer model, which would then pick out, on a second-by-second basis, the 100 video clips that most closely matched each brain pattern. All of these best-guess clips were smushed together into a single video, and when these videos were compared to the original clips that the test subjects had been watching, there was a substantial, if slightly surreal, correlation:
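As a rough illustration only (this is not the researchers' actual model), the core matching-and-averaging step might be sketched in Python like this, using made-up voxel patterns and tiny synthetic clip frames:

```python
import numpy as np

def reconstruct_frame(brain_pattern, clip_features, clips, k=100):
    """Toy sketch: pick the k database clips whose predicted brain
    patterns best match the measured one, then average ("smush") them."""
    # Cosine similarity between the measured pattern and each clip's
    # predicted pattern; the real encoding model is far more sophisticated.
    norms = np.linalg.norm(clip_features, axis=1) * np.linalg.norm(brain_pattern)
    sims = clip_features @ brain_pattern / norms
    top_k = np.argsort(sims)[-k:]        # indices of the k best-matching clips
    return clips[top_k].mean(axis=0)     # average them into one best-guess frame

# Tiny synthetic demo: 500 database clips, 64-voxel patterns, 8x8 frames
rng = np.random.default_rng(0)
clip_features = rng.normal(size=(500, 64))  # predicted brain pattern per clip
clips = rng.uniform(size=(500, 8, 8))       # one grayscale frame per clip
pattern = rng.normal(size=64)               # "measured" brain pattern
frame = reconstruct_frame(pattern, clip_features, clips, k=100)
```

Run once per second of scan data, this yields the blurry, dreamlike composites seen in the comparison videos: no single clip matches well, but the average of the 100 closest guesses captures the gist.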

Comparing the brain-scan video to the original video is just a way to prove that the system works, but there's nothing stopping this technique from being used to suck video out of people's heads directly. With a bit of refinement of the algorithm and sensor resolution, you could go stick yourself into an fMRI machine, close your eyes, picture a duck, and the machine would be able to project an image of a duck (or something duck-like) onto a screen. You could even fall asleep in one of these, and record video of your dreams.

Of course, from a slightly more sinister angle, it's also possible that someone could put you in an fMRI machine and suck your memories out. And while the researchers comment that this technology is "decades from allowing users to read others' thoughts and intentions," that's also an admission that yeah, in a decade or two, they'll be able to do exactly that.

Berkeley, via SciAm and io9
