Two years ago, at Adobe MAX 2011, Adobe engineers showed off a prototype de-blur feature the company was working on. Even in its early stages, the technology attracted a great deal of interest from consumers, many of whom were frustrated when Adobe stated that the filter wouldn’t be ready in time for CS6. It’s taken a great deal of additional work to get the filter ready, but the company now plans to demo it as part of Adobe MAX 2013 — this time as an upcoming feature rather than a prototype concept.

This brings up a rather interesting question. Blur is one of the most common problems in photography, and it can happen to anyone. Shooting from a stable tripod avoids the problem, but tripods are in short supply when you’re shooting images on the fly with a smartphone. For such a ubiquitous problem, you’d think Photoshop would’ve implemented a filter long ago. Yet while there are third-party filters you can buy that offer various de-blurring options, Photoshop doesn’t have an integrated solution.

Why not? Because de-blur is hard. Blur can be introduced by focus errors, camera shake, or because an object is moving or rotating. Mathematically modeling the problem and applying the appropriate corrective action often means knowing something about the direction of movement (if any) and the types of errors that were introduced as a result. There are a huge number of potential reasons why an image may look blurry, and correcting them requires underlying knowledge of how the photo was taken. Incorporating that data into a one-click filter is extremely difficult.
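To see why the math matters, blur is typically modeled as convolution of the sharp image with a point-spread function (PSF) describing the camera’s motion; if you can estimate the PSF, you can attempt to invert it via deconvolution. Here’s a minimal 1-D sketch using Wiener deconvolution — the box-shaped motion kernel, signal, and noise parameter are illustrative assumptions on my part, not Adobe’s actual algorithm:

```python
import numpy as np

def motion_blur_kernel(length, size):
    """Hypothetical PSF: uniform horizontal motion over `length` samples."""
    k = np.zeros(size)
    k[:length] = 1.0 / length
    return k

def wiener_deblur(blurred, kernel, noise_power=1e-6):
    """Wiener deconvolution in the frequency domain (1-D sketch)."""
    H = np.fft.fft(kernel, n=blurred.size)
    G = np.fft.fft(blurred)
    # Wiener filter: conj(H) / (|H|^2 + noise-to-signal ratio).
    # The noise term keeps division stable where |H| is near zero.
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft(F_hat))

# Demo: blur a sharp step signal with the assumed PSF, then recover it.
signal = np.zeros(64)
signal[20:40] = 1.0
kernel = motion_blur_kernel(5, 64)
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)))
restored = wiener_deblur(blurred, kernel)
```

The hard part in practice is exactly what the article describes: in this toy example the kernel is known, while a real tool must estimate the PSF from the blurry image itself (blind deconvolution) — and a wrong estimate makes the result worse, not better.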

Will the feature be cloud-only?

Adobe has released a video of its new camera motion compensation technology in action, and it’s an impressive piece of work.

There’s a twist, however. If Adobe holds true to its plans from last fall, this new de-blur capability will only be available to Adobe Creative Cloud customers. In an interview late last October, Adobe CEO Shantanu Narayen had this to say:

There is some magic that these people are working on. You can take a picture that was actually completely fuzzy – and they analyze the camera’s motion, and based on that they can actually sharpen the picture, including seeing text that was not clear. And so what we will do, is that increasingly all of those things will be available through a cloud service – and only through a cloud service. (Emphasis added)

Exactly how much that would cost remains unclear. On Adobe’s website, a subscription to Creative Cloud comes out to $49 per month, while Newegg sells a Photoshop CS6 12-month subscription for $239. The latter is considerably more palatable than the former if you’re a casual user, but there’s no information on whether you can still get access to the new features by signing up for a single Adobe product as opposed to the full Creative Cloud.

This is something Adobe needs to clarify. A monthly subscription won’t work for everyone, and restricting which products get which features in an attempt to force people to switch could backfire enormously. Still, the underlying technology is great to see — hopefully we’ll be able to correct a wide range of imperfect images via a straightforward process.
