If you thought slow-mo videos of watermelons being exploded by bullets and chubby, jowled faces recoiling from slaps were as good as it gets, think again: MIT Media Lab has created a camera-and-laser rig that's capable of capturing one trillion exposures per second, fast enough to visualize the speed of light and to watch photons scatter as they strike surfaces.

At this point you should probably watch the video below, because it explains the setup a lot better than I can. If you can't watch it, or you simply want to see me squirm, here's the gist of it: Femtosecond laser pulses are shone on an object. In front of the camera is a narrow slit, so that only a thin slice of the reflected laser light can be seen at one time; the technical name for this device is a "streak camera." Using very precise timing circuitry, the camera's array of 500 sensors picks up the returning light, but only one "scan line" at a time (thanks to the narrow slit). Using mirrors, the camera's angle of view is shifted over time until these one-dimensional slices can be built up into a complete 2D image. This process, which takes about an hour, has led one of its creators, Ramesh Raskar, to dub this trillion-FPS wonder "the world's slowest fastest camera."
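The acquisition scheme above can be sketched in a few lines of toy code. This is not MIT's actual pipeline, just a minimal NumPy model under my own assumptions: the `streak_record` function, the image dimensions, and the fake sweeping pulse are all hypothetical. A streak camera records one spatial line (x) against time (t); sweeping a mirror through each row (y) stacks those slices into a (t, y, x) data cube, and fixing t gives one frame of the "light in flight" movie.

```python
import numpy as np

# Hypothetical dimensions: image rows, columns, and time bins.
HEIGHT, WIDTH, TIME_BINS = 120, 160, 100

def streak_record(y):
    """Pretend streak-camera readout for row y: intensity over (time, x).

    In this toy every row sees the same fake pulse sweeping left to
    right; a real capture would average many identical laser pulses.
    """
    x = np.arange(WIDTH)
    t = np.arange(TIME_BINS)[:, None]        # shape (TIME_BINS, 1)
    pulse_pos = t * WIDTH / TIME_BINS        # pulse front position at time t
    return np.exp(-((x - pulse_pos) ** 2) / 50.0)  # (TIME_BINS, WIDTH)

# Sweep the "mirror" over every row, stacking 1-D slices into a data cube.
cube = np.stack([streak_record(y) for y in range(HEIGHT)], axis=1)
# cube.shape == (TIME_BINS, HEIGHT, WIDTH)

# One movie frame: where the light has reached at time bin 40.
frame_at_t40 = cube[40]
```

The key point the code makes concrete: no single exposure contains a 2D image; the 2D frames only exist after the full sweep, which is why a one-hour capture can still yield a trillion-frames-per-second movie.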

I know what you're going to ask next (what's the point?), and to be honest, there isn't really an answer yet. The only immediate application is material analysis: In much the same way that ultrasound is used to detect flaws in metal, the scattering of photons could also tell us a lot about materials, but whether it's better or more cost-effective (the MIT setup costs $250,000) remains to be seen.

Raskar also suggests that the streak camera and laser combo could lead to breakthroughs in lighting technology, as in studio lights and the flash units in point-and-shoot cameras. When you can see the exact path that photons take to create a certain effect, you are one step closer to recreating that effect with cheaper gear, or in software post-processing.

Read more at MIT and Ramesh Raskar's website.