Artists are supposed to be among the least likely to lose their jobs to automation, but what happens when AI-enabled features start painting, editing, and doing other parts of their jobs for them? AI tools are already starting to automate what used to be time-consuming manual processes — but the results may end up fueling artists’ creativity rather than killing their jobs.

Companies that make industry-standard creative tools, like Adobe and Celsys, have been adding AI features to their digital art software in recent years in the hope that they’ll speed up workflows by eliminating drudge work and give artists more time to experiment. From machine learning tools that help find specific video frames faster to features that color in entire works of line art at the press of a button, AI is being incorporated in subtle but surprisingly impactful ways.

The best AI features can assist artists and cut out repetitive tasks, says Tatiana Mejia, who manages Adobe’s AI platform, Sensei. Her assessment comes from a Pfeiffer Consulting study commissioned by Adobe, in which most creatives said they weren’t worried about being replaced by AI, and that they saw the most potential in AI and machine learning applied to tedious, uncreative tasks. That could mean a smart cropping feature that automatically recognizes the subject of a photo, or automatic image tagging to help people find stock photos faster. These features still leave the artist in control, though. “Creativity is profoundly human,” Mejia says. “AI cannot replace the creative spark.”

Adobe makes a big splash every time it shows off concept AI tools, like content-aware fill for video, but it’s usually the more subtle AI features that actually ship. Recent additions include an automated audio mixing feature in Premiere and the ability to create searchable PDFs through optical character recognition in Adobe Scan. It’s a lot less flashy than AI automatically removing unwanted objects from videos, but it’s enough to take some of the drudgery out of creative work.

There are more AI features in the works from Adobe that could have significant impacts on productivity behind the scenes. One feature can take a video of a dog jumping in a pool and generate descriptive tags; another can take a simple doodle of a mushroom and pull up similar-looking photographs, much like Google’s experimental Quick Draw and AutoDraw tools that use neural networks to recognize sketches.

Image: Grid View. Watching a video of a dog jumping in a pool, the AI generates descriptive tags for the object and the action, along with a confidence level for each. The timeline below the video shows when the action occurs and when the subject appears, so editors can find specific scenes faster. (Adobe)
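The caption above describes tags that carry both a confidence score and the time spans where a subject or action appears. A minimal sketch of that idea — all names here are illustrative, not Adobe’s actual data model — might look like this:

```python
from dataclasses import dataclass

@dataclass
class Tag:
    label: str          # e.g. "dog" (object) or "jumping" (action)
    confidence: float   # model's confidence in the tag, 0.0 to 1.0
    spans: list         # (start_sec, end_sec) intervals where it occurs

def scenes_with(tags, label, min_confidence=0.8):
    """Return the time spans where a sufficiently confident tag occurs."""
    return [span
            for tag in tags
            if tag.label == label and tag.confidence >= min_confidence
            for span in tag.spans]

tags = [
    Tag("dog", 0.97, [(0.0, 12.5)]),
    Tag("jumping", 0.91, [(3.2, 4.8), (9.0, 10.1)]),
    Tag("cat", 0.35, [(6.0, 7.0)]),  # low confidence: likely a misdetection
]

print(scenes_with(tags, "jumping"))  # → [(3.2, 4.8), (9.0, 10.1)]
print(scenes_with(tags, "cat"))      # → [] (filtered out by confidence)
```

The confidence threshold is what makes this useful to an editor: uncertain detections are hidden by default, but lowering `min_confidence` surfaces them when a search comes up empty.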

Image: Search by sketch. (Adobe)

Image: Search by image. (Adobe)

Other AI tools could have more dramatic implications for how artists work, like an auto-coloring tool designed for comics and animation. A beta version of Celsys’ manga and illustration software Clip Studio now includes an AI feature that, with just a little guidance from the artist, can automatically color in black-and-white line drawings. The results can be unpredictable and require a little cleanup, but there’s huge potential in the way the technology can be used by artists and studios.

AI coloring tools could play a big role in the future of 2D animation, says animator João Do Lago, who has worked on various animated series including Netflix’s Castlevania. Do Lago says they could give artists room to experiment by cutting down the time it takes to color each frame.

“One of the things that makes animation really hard to do is the amount of time it takes to create, which makes it so that most studios just stick to a visual style and formula that has been proven to work before,” Do Lago says. “But when you can automate a big part of that process, you allow more room for different ideas and visual languages to be explored, since you can iterate on them a lot faster.”

Some studios have already begun investing in auto-coloring research, including OLM, the production studio behind the Pokémon anime. And similar AI colorization tools have been around for a few years, including Japanese AI startup Preferred Networks’ browser-based tool PaintsChainer, which is already being used by some manga publishers, and Style2Paints, a web-based bot created by a team of research students from the Chinese University of Hong Kong and Soochow University.

“Another test using the Colorize tech on one of my animations. This time with some manual tweeking to try to achieve more controled results. It's very limited tech, but the results are still impressive. No doubt A.I's will play a huge role on the future look of 2d animation. pic.twitter.com/e8mEdlI0Wg” — Joao (@JonnyDoLake), December 18, 2018

Celsys says its AI is trained on pairs of line art, extracted from color illustrations, and the corresponding colored images. The technology is based on deep learning, combining computer vision techniques like those used in self-driving cars with image-generation techniques like the one Nvidia used to create strikingly realistic AI-generated faces. It’s worth noting that the company is also using artists’ uploaded data to improve the technology, though Celsys says artists will retain copyright for both the uploaded and generated images, and that the image data will never be released.
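The training setup described above — deriving line art from finished color illustrations so a model can learn the reverse mapping — can be sketched in a few lines. The edge extraction below is a deliberately crude luminance-gradient threshold for illustration only; Celsys doesn’t publish its method, and a production system would be far more sophisticated:

```python
import numpy as np

def extract_line_art(color_image, threshold=0.2):
    """Approximate line art: mark pixels where luminance changes sharply."""
    gray = color_image.mean(axis=2)                  # collapse RGB to luminance
    dy = np.abs(np.diff(gray, axis=0, prepend=gray[:1]))
    dx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    edges = (dx + dy) > threshold
    return np.where(edges, 0.0, 1.0)                 # black lines on white

# Toy "illustration": a red square on a white background, values in [0, 1].
img = np.ones((8, 8, 3))
img[2:6, 2:6] = [1.0, 0.0, 0.0]

line_art = extract_line_art(img)
# The training pair is (line_art, img): the model sees only the outlines
# and learns to reproduce the finished color image.
```

Running this over a library of finished illustrations yields the paired examples a colorization model needs, without anyone having to draw the line art twice.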

Celsys is optimistic that the tools it’s providing are in artists’ best interests, rather than technologies meant to replace them. “We think that AI features are just one type of feature within a digital art tool,” a Clip Studio spokesperson said. “Our hope is that creators can easily use these tools in the way that’s best for them.”

And while AI features might remove some work, Mejia at Adobe thinks it’ll just leave room for more and better work. “There may be something that used to be manually done that will be automated,” she says, “but it’ll be more than offset by the increased demand in terms of quantity and quality of work that’s expected.”