Any design nerd, futurist, or techie worth their salt has heard of the MIT Media Lab. Few, however, have heard of the Fluid Interfaces Group. No, it’s not a smooth jazz outfit—it’s a division of the famous Media Lab, and home to some of the niftiest display prototypes and interface designs this side of the Mississippi.

Headed by Media Lab professor Pattie Maes, the Group claims its goal is to develop displays and interfaces that are a “more natural extension of our minds, bodies and behavior,” with a special interest in applications that augment “learning, understanding, decision-making and collaboration.”

Fluid has been around for at least seven years, but recent advances in mobile, sensor, and display technology seem to have inspired a wealth of breathtaking new projects. Here are seven recent ideas that offer a glimpse at the future of interface technology.


SpiderVision bills itself as a solution to man’s limited field of view, which is apparently inadequate. With an Oculus Rift head-mounted display (HMD), camera, and wide-angle lens, SpiderVision provides a 360-degree view of your surroundings. It accomplishes this by blending the rear camera’s feed into the front view. The coolest part is that it only mixes in the rear view when activity is detected back there, freeing up the user’s perception for whatever task is most important.

Okay, so with a burly Oculus Rift headset strapped to your head the product loses a bit of its luster. But it’s just a concept. Imagine the same idea applied to a much smaller, hardly noticeable HMD.
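That activity-gated blending can be sketched with simple frame differencing. Everything here is illustrative, not from the actual prototype: the function names, threshold, and blend factor are all assumptions.

```python
import numpy as np

MOTION_THRESHOLD = 12.0  # hypothetical: mean pixel change that counts as "activity"
BLEND_ALPHA = 0.35       # hypothetical: how strongly the rear view is mixed in

def detect_activity(prev_rear, rear):
    """Crude frame-differencing motion detector on the rear camera feed."""
    diff = np.abs(rear.astype(np.float32) - prev_rear.astype(np.float32))
    return diff.mean() > MOTION_THRESHOLD

def compose_view(front, rear, active):
    """Blend the rear frame into the front view only when activity was detected."""
    if not active:
        return front  # rear view stays out of the way
    blended = (1 - BLEND_ALPHA) * front + BLEND_ALPHA * rear
    return blended.astype(np.uint8)
```

The point of the gate is in `compose_view`: when nothing is happening behind you, the front view passes through untouched.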

Fluid projects are most impressive when you imagine how they’ll mature, and the Augmented Magnifier is the perfect example. The transparent display/magnifying glass quickly renders a 1,000x magnification of a fruit and overlays the image with the scientific name, common name, and a molecular-level magnification.

We’re not far from smartphones with built-in microscopes. It’s not difficult to imagine even broader functionality in a handheld device.

This has a truly amazing range of possibilities. FingerReader scans, reads, and transcribes printed text as users trace their finger (or, more precisely, the FingerReader) across a line of text. It could be used to assist visually impaired readers, facilitate text transcription, or translate foreign-language texts. It’s also not difficult to imagine a similar device working on electronic displays.

How about a car that can detect your emotions and respond appropriately? Through a series of installed sensors and cameras, AutoEmotive can tell how distracted drivers are based on their facial expressions, steering-wheel grip, and other factors. With this information, the system can adjust headlight field-of-view, navigator tone of voice, music, and map directions accordingly. The team even has a prototype showing how thermochromic paint could be used to dynamically change the color of the car to warn other drivers on the road.

Of course, if driverless vehicles take off—which they certainly will—none of this will matter much. But it would be nice to have an on-board computer that can make music suggestions based on your mood.
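The sense-then-respond loop is easy to picture as a weighted fusion of the signals the article mentions. AutoEmotive’s real model isn’t public, so the weights, thresholds, and responses below are pure illustration.

```python
# Hypothetical weights for the sensor signals the article mentions.
WEIGHTS = {"facial": 0.5, "grip": 0.3, "steering": 0.2}

def distraction_score(facial, grip, steering):
    """Combine normalized (0-1) sensor readings into one distraction score."""
    signals = {"facial": facial, "grip": grip, "steering": steering}
    return sum(WEIGHTS[name] * value for name, value in signals.items())

def respond(score):
    """Map the score to the kinds of adjustments the article describes."""
    if score > 0.7:
        return "widen headlights, soften navigator voice, simplify directions"
    if score > 0.4:
        return "play calmer music"
    return "no change"
```

The interesting design question is the response tiers: a system like this wants graduated nudges, not a single alarm.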


This one is a bit over my head, but it still looks awesome. Basically, Clearcut is a laser cutter with a transparent display that lets it directly scan objects and graphics, then cut the materials accordingly. Users can draw and edit graphics directly on the display, or make virtual copies of physical objects. Here’s an idea that shows the Media Lab’s roots in architecture and design.

Here’s how augmented reality could work with the Internet of Things. We’ve seen a lot of AR, particularly in the form of Google Glass, but we haven’t really seen how it can be used to enhance or conceal the controls of everyday objects. Smarter Objects projects a graphical interface onto a mobile AR device and lets users remotely control simple objects like radios, doorknobs, lighting systems, and more. This is one of those things you need to see in action to appreciate.

Glassified is just a ruler with a built-in transparent OLED display. Why would anyone want a ruler with a built-in transparent OLED display? Because a digitizer on the device recognizes pen strokes and feeds them into the ruler’s tiny computer. This allows users to seamlessly and spontaneously interact with and measure their own scribblings. It could be revolutionary for engineers and architects, or it could be an entirely new gaming platform. Most likely, something like it will eventually find its way into a smartphone.
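Once a digitizer hands you a pen stroke as a list of sample points, measuring the scribble reduces to summing segment lengths. A minimal sketch, assuming strokes arrive as `(x, y)` tuples (the format is an assumption, not Glassified’s actual API):

```python
import math

def stroke_length(points):
    """Total length of a pen stroke: sum of distances between
    consecutive digitizer samples, each an (x, y) tuple."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))
```

A stroke sampled at `(0, 0)`, `(3, 4)` measures 5.0 units, and denser sampling simply adds more, shorter segments to the same sum.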