(CNN) Researchers at the Massachusetts Institute of Technology are teaching robots to "see" what an object looks like just by touching it and predict what something will feel like by looking at it.

A team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has created a predictive artificial intelligence system to help robots combine multi-modal sensory inputs.

It's something people do all the time.

If you look at a fuzzy sweater and a barbell, you'll be able to tell that one is soft and the other is hard. Or if someone hands you a hammer, you'll be able to get a pretty good idea of what it looks like, even if you're blindfolded.

MIT researchers used video and data from a sophisticated touch sensor on a robot arm to predict what an object would feel like by looking at it.

Robots can be equipped with visual and touch sensors, but it's tougher for them to combine that information.
