DigInfo TV screencap

There are so many scintillating technologies in the works that one imagines looking back on James Cameron’s Avatar as almost quaint. As absurd as that sounds, looking around the technology space is like looking into a future that would have seemed nearly impossible only a decade ago. With the truly mind-blowing speed at which the internet, smartphones, and digital cameras have grown in functionality and ubiquity, so too have the ways filmmakers can shoot and distribute their work. New camera technologies, and new tools for watching that work on just about any screen you own, are popping up all the time.

Here’s a quick look around the world at some emerging technologies:

Dynamic Target Tracking Camera

Well, this is just wild. The University of Tokyo has created a camera that can capture high-speed flying objects and keep them centered on the screen at all times. Imagine what a director like, say, Danny Boyle or Kathryn Bigelow (arguably the greatest action director working) could do with a camera like this. As Hiromasa Oku, assistant professor at the University of Tokyo, explains, changing the direction a camera faces typically means moving the camera itself, manually or mechanically; with this prototype, it’s not the camera that moves but the mirrors. In the demonstration above, the camera tracks and centers the tennis ball even as it moves extremely fast, because the mirrors that steer the camera’s view can respond at high speeds, on the order of milliseconds.
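To get a feel for the idea, here is a toy sketch (not the Tokyo lab’s actual control code, and all the numbers are made up) of mirror-based tracking: instead of panning the heavy camera body, a controller nudges the angles of two lightweight mirrors so the tracked object’s centroid stays at the image center.

```python
# Toy proportional controller for mirror-steered target tracking.
# Assumed values: a 640x480 sensor and a gain of 0.01 degrees of
# mirror tilt per pixel of centering error.

IMAGE_W, IMAGE_H = 640, 480
GAIN = 0.01

def mirror_update(target_x, target_y, pan_deg, tilt_deg):
    """Return new (pan, tilt) mirror angles that re-center the target.

    target_x/target_y: detected centroid of the object, in pixels.
    pan_deg/tilt_deg: current mirror angles, in degrees.
    """
    err_x = target_x - IMAGE_W / 2   # horizontal offset from center
    err_y = target_y - IMAGE_H / 2   # vertical offset from center
    # Because only the lightweight mirrors move, this correction can be
    # applied every millisecond or so -- far faster than slewing a camera.
    return pan_deg + GAIN * err_x, tilt_deg + GAIN * err_y

# Example: a tennis ball detected 100 px right of and 50 px above center.
pan, tilt = mirror_update(420, 190, 0.0, 0.0)
print(round(pan, 2), round(tilt, 2))  # 1.0 -0.5
```

The real system pairs this kind of fast steering with high-speed vision that re-detects the object at each step; here the detection is simply assumed.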

IllumiRoom

This proof-of-concept system from Microsoft Research will blur the lines between what’s happening on your TV screen and the rest of your living room, immersing you in a gaming experience like no other. Using a Kinect for Windows camera and a projector, IllumiRoom combines the virtual and physical worlds by changing the appearance of your room, inducing apparent motion, extending the field of view, or enabling new gaming experiences. If you’ve ever played a video game and wished your entire room was the gaming environment, well, you’re in luck.

M-Go

This joint venture between DreamWorks Animation and Technicolor has deals with major studios to stream films the same day they become available on Blu-ray and DVD. M-Go will now be available on 2012 and 2013 LG Smart TVs, letting people stream new video releases right on their televisions; its app already works on Samsung and Vizio TVs, tablets, and Blu-ray players. It’s another solid addition to a market with plenty of great streaming options, as Hollywood continues to make high-quality streaming possible, and legal, protecting the hard work of artists while freeing film lovers to see the latest releases where they want, when they want.

More Realistic Simulated Cloth

Of the many challenges facing animators in film and video games, creating clothes that look right is every bit as hard as nailing a character’s facial expressions or movements. Rendering cloth has long been a problem for filmmakers. Henrik Wann Jensen, a Ph.D. advisor to Iman Sadeghi, the developer of a new model for rendering realistic cloth, said that cloth in movies often looks wrong. “This model is the first practical way of controlling the appearance of most types of cloth in a realistic way,” he said in a press release.

That new model is the work of computer scientists at the University of California, San Diego, who say it captures the way cloth and light interact with unprecedented accuracy. Developed by Sadeghi, the model takes a new approach: it simulates how each individual thread scatters light. “It essentially treats the fabric as a mesh of interwoven microcylinders, which scatter light the same way as hair, but are oriented at 90 degrees from each other,” Sadeghi said in a press release.

Sadeghi is not new to the film world; he is an expert in simulating how light interacts with hair. While he was a Ph.D. student, he developed a model later used in Disney’s Tangled, in which Rapunzel had 70 feet of simulated blonde hair.

The new cloth model can be used not only for existing fabrics but can also “act as a framework to visualize what new fabrics would look like,” Oleg Bisker, a co-author on the paper, said in a press release. “We can simulate any combination of weaving pattern and thread types.”
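The core idea, threads as tiny cylinders that scatter light like hair, with warp and weft oriented 90 degrees apart, can be sketched in a few lines. This is not the UCSD model itself (which is far more sophisticated); as a stand-in for a single thread’s scattering, it borrows the classic Kajiya-Kay hair-shading terms.

```python
import math

def dot(a, b):
    """Dot product of two 3-vectors stored as tuples."""
    return sum(x * y for x, y in zip(a, b))

def kajiya_kay(tangent, light, view, shininess=32):
    """Hair-style scattering along one thread direction (unit vectors).

    Kajiya-Kay-style stand-in: diffuse ~ sin(angle between thread and
    light); specular peaks when light and view mirror about the thread.
    """
    tl, tv = dot(tangent, light), dot(tangent, view)
    diffuse = math.sqrt(max(0.0, 1.0 - tl * tl))
    spec = diffuse * math.sqrt(max(0.0, 1.0 - tv * tv)) - tl * tv
    return diffuse + max(0.0, spec) ** shininess

def cloth_shade(light, view, warp_weight=0.5):
    """Blend two perpendicular thread families (warp and weft)."""
    warp = (1.0, 0.0, 0.0)   # threads running one way across the weave
    weft = (0.0, 1.0, 0.0)   # threads at 90 degrees to them
    return (warp_weight * kajiya_kay(warp, light, view)
            + (1 - warp_weight) * kajiya_kay(weft, light, view))

# Light and viewer both straight overhead, perpendicular to the fabric:
print(round(cloth_shade((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)), 3))
```

Because the two thread families respond differently as the light direction changes, even this toy version produces the direction-dependent sheen that makes woven fabrics look different from flat painted surfaces.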

Creating 3D Images Through a Single Lens (Without Moving the Camera)

Leave it to the brains at Harvard’s School of Engineering and Applied Sciences to devise a way for photographers and microscopists to create 3D images through a single, stationary lens. The technology relies on computation and mathematics to pull off the counterintuitive feat of seeing a stereo image with, essentially, one eye closed. A single eye normally gets almost no sense of depth, but the Harvard team manages it without specialized hardware such as microlens arrays or absorbing masks that record the direction of light. Instead, they focus the camera at different depths, infer the angle of the light arriving at each pixel, and use the slight differences between these shots to compute brand-new images as if the camera had been moved to one side. The research is aimed at creating stereo images without the need for expensive hardware, and the technique could also offer an accessible way to create 3D images of translucent materials, such as biological tissues.
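The last step, turning per-pixel depth into a view from a camera that never moved, is the easiest part to sketch. The toy below (assumed and simplified; the Harvard system infers depth from focus and angular information, whereas here the depth map arrives ready-made) shifts each pixel by a depth-dependent disparity to fake a second viewpoint.

```python
def synthesize_shifted_view(image, depth, baseline=2.0):
    """Shift each pixel left by baseline/depth to fake a second viewpoint.

    image: 2D list of pixel values; depth: 2D list of depths (same shape).
    Nearer pixels (small depth) shift further, producing parallax, which
    is exactly the cue a stereo pair gives your two eyes.
    """
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            disparity = int(round(baseline / depth[y][x]))
            nx = x - disparity
            if 0 <= nx < w:
                out[y][nx] = image[y][x]  # near content slides further left
    return out

# 1x4 "image": a bright pixel at x=3 on a near plane (depth 1) shifts by 2,
# while the far background (depth 100) barely moves.
img = [[0, 0, 0, 9]]
dep = [[100, 100, 100, 1]]
print(synthesize_shifted_view(img, dep))  # [[0, 9, 0, 0]]
```

Pairing the original image with the synthesized one yields a stereo pair, and hence a 3D image, from a single stationary lens, which is the effect the Harvard work achieves computationally.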