Furniture has traditionally been static. We sit at tables and in chairs that hold the same rigid shape no matter what we’re doing or how we’re feeling. As our homes become smarter and more personalized, furniture has been almost wholly left out of the revolution.

It’s a shame. Just imagine if your sofa could sense how you’re feeling when you get home from work. To stave off marathon TV sessions, it could transform from a cushioned pile of pillows into a rigid lounger, encouraging you to go outside and move around. That exact shape-shifting scenario remains unlikely, but a new project from MIT’s Tangible Media Group offers a more realistic vision of what might happen when our furniture can finally respond to us.

Called Transform, this table-like structure metamorphoses based on the motions and emotions of the humans around it. Developed by Sean Follmer, Daniel Leithinger and Hiroshi Ishii, the magical device was on show at the Lexus Design Amazing display during Milan Design Week.

The team describes Transform as a table, though you’d be hard-pressed to eat dinner at it. The rectangular object is made of 1,152 plastic pins, each controlled by its own microprocessor underneath. A computer program dictates how each pin moves, creating undulating waves and pushing pins up into sandcastle-like structures to tell a sort of tangible narrative. A Kinect mounted above senses when someone is nearby, and as you run your hand over the pins, they shy away like a school of fish when you dip your hand in the water.

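To make the behavior concrete, here is a minimal sketch of the kind of control loop such a pin display might run: each pin’s target height combines a traveling wave with an “avoidance” term that retracts pins near a sensed hand position. The grid dimensions (48 × 24, matching the article’s 1,152 pins), wave parameters, and avoidance radius are all assumptions for illustration; this is not the actual Transform software.

```python
import math

GRID_W, GRID_H = 48, 24      # 48 * 24 = 1,152 pins, as described in the article
MAX_HEIGHT = 100.0           # pin travel, in arbitrary units

def pin_heights(t, hand=None, avoid_radius=6.0):
    """Return a 2-D grid of target pin heights at time t (seconds).

    hand: optional (x, y) grid position reported by a depth sensor
    (a Kinect, in the installation); pins near it retract.
    """
    heights = []
    for y in range(GRID_H):
        row = []
        for x in range(GRID_W):
            # A sine wave traveling across the surface.
            h = 0.5 * MAX_HEIGHT * (1 + math.sin(0.4 * x + 2.0 * t))
            if hand is not None:
                dist = math.hypot(x - hand[0], y - hand[1])
                if dist < avoid_radius:
                    # "Shy away": smoothly scale height toward zero
                    # directly under the hand.
                    h *= dist / avoid_radius
            row.append(h)
        heights.append(row)
    return heights

# Pins directly under the hand retract fully; distant pins are unaffected.
field = pin_heights(0.0, hand=(10, 10))
assert field[10][10] == 0.0
```

In a real installation the loop would run continuously, with the depth sensor updating the hand position each frame and the per-pin microprocessors driving actuators toward these target heights.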
An Evolving Project

If it looks familiar, that’s because the people responsible for Transform are the same people who created the astounding InFORM project. Last fall, when the Tangible Media Group released footage of InFORM, the internet’s head collectively exploded. In the video you watch as a human’s motions on screen are translated into a shape-shifting 3-D display, almost like a computer-assisted pin art toy.

It was truly bonkers, and not just because of how strange it looked. Cooler than the obvious visual appeal was the idea that someday we might actually use something like this to communicate with each other. InFORM was a first glimpse at a world where human-computer interaction has moved beyond flatscreens into the physical realm.

Though Transform moves similarly to InFORM, the projects actually have little in common. InFORM was essentially a way to make a computer interface exist tangibly, so the resulting project still very much looked and acted like a computer might. “Transform is going a little further,” explains Follmer, one of the engineers on the project. “We’re saying, what could it mean to have physical interaction more embedded in your home and in your life?”

Follmer and Leithinger believe computer-human interaction doesn’t have to look like a computer. In fact, they’re betting that in the future, technology will be so embedded in our surroundings that we’ll hardly notice it at all. “To me the most terrifying vision would be to be surrounded by touchscreens,” says Leithinger.

Beyond the Touchscreen

As our possessions become smarter and smarter, the question becomes less about whether we can interact with these objects and more about how we want to. Touchscreens will simply be one of many options; after all, swiping and tapping a flat, glassy screen isn’t a blanket solution for making something interactive.

“Materiality and tactility are fundamental human desires,” says Ishii. In a world where we’re increasingly surrounded by flat pixels, Ishii’s lab is on a quest to figure out how we can avoid a glass-covered future.

Transform is still very much a rough proof of concept, but the potential applications of this tangibility are easy to imagine: furniture that reacts to our mood or surroundings, tangible architectural renderings, a new way to visualize topographic data. And that’s only a glimpse of what’s possible.

In the future, all of our connected surroundings will have a richness that goes far beyond a flat screen, the team is betting. As Leithinger puts it: “Every little thing I have on me will be reacting to me in the future, and I don’t think only through pixels.”