With ever more capable video cameras now present in nearly every facet of our lives, the tools that help us make use of the resulting footage are gradually evolving, too.

Keying in on the trend of social cameras (wearable cameras like the GoPro), Disney Research has devised an ingenious tool that automatically edits the final footage together using an intuitive, human-style approach.


In a research paper titled "Automatic Editing of Footage from Multiple Social Cameras," the research team describes software that splices together footage shot on different devices, with results that should make at least some human video editors nervous. By estimating the 3D motion of each camera, the software can lock in on what the cameras are paying the most attention to and thereby determine when and where to cut to another camera. The algorithm can also determine which camera has the best view of the subject and switch to it, effectively mirroring the subtle decision-making process of a human video editor.
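The paper's actual optimization is far more involved, but the selection step described above can be sketched roughly as follows. Everything here is an illustrative assumption rather than Disney's method: the `view_score` heuristic, its weighting, and the hand-picked camera poses are all hypothetical.

```python
# Sketch: given each camera's estimated position and viewing direction,
# plus the scene's joint-attention point (where most cameras are looking),
# score every camera and cut to the one with the best view.
import math

def view_score(cam_pos, cam_dir, attention_point):
    """Lower is better: distance to the subject plus an off-axis penalty."""
    to_subject = [a - c for a, c in zip(attention_point, cam_pos)]
    dist = math.dist(cam_pos, attention_point)
    # Angle between the camera's viewing direction and the subject direction.
    dot = sum(d * t for d, t in zip(cam_dir, to_subject))
    cos_angle = dot / (math.hypot(*cam_dir) * dist)
    angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return dist + 5.0 * angle  # the weight is an arbitrary illustrative choice

def pick_camera(cameras, attention_point):
    """Return the index of the camera with the best (lowest) view score."""
    scores = [view_score(pos, direction, attention_point)
              for pos, direction in cameras]
    return scores.index(min(scores))

cams = [((0, 0, 0), (1, 0, 0)),   # close and facing the subject head-on
        ((5, 5, 0), (0, -1, 0))]  # farther away and off-axis
print(pick_camera(cams, (3, 0, 0)))  # → 0 (the head-on camera wins)
```

A real system would also penalize cutting too often or too rarely, which is part of what the researchers optimize over the whole timeline.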

Image: Disney Research

So while the final footage still looks shaky and disjointed (see video above), the automated video editor does an amazing job of seamlessly pulling the viewer's eye along by maintaining focus on the main subject in the video.

If any of this sounds familiar, it might be because you've seen VyClone's automatic video editing at work. Disney's researchers put their system up against VyClone's to show how its random jump-cut editing compares with their more subject-focused approach.

Looking at the footage side by side demonstrates how much more powerful Disney's solution could be for those looking to truly construct documentary footage as opposed to random video mixes composed of footage shot at the same location.

But the real power of the system is on display when the researchers play footage edited by Disney's system next to footage edited by a human video editor. Although both streams are somewhat jerky, with the two left unlabeled it really is hard to tell which is automated and which is human-edited.

By deliberately avoiding jump cuts and observing the "180-degree rule" (keeping the camera on one side of the subject, so the right-to-left relationship of the subject to the camera never flips), the software successfully mimics the edits a human might make to maintain the visual continuity of a scene.

"The resulting videos might not have the same narrative or technical complexity that a human editor could achieve," admitted Ariel Shamir, a member of Disney Research Pittsburgh, in a statement on the team's website, "but they capture the essential action and, in our experiments, were often similar in spirit to those produced by professionals."

Although the software isn't commercially available just yet, the researchers believe that in the future, rather than replacing human video editors, the system could become a tool that shortens post-production time on major video projects.