The concept of “deepfakes” hit the internet late last year like a weirdly rendered bombshell after amateur coders discovered that they could use AI to quickly and easily face-swap celebrities into pornographic clips. The phenomenon raised important questions about consent and revenge porn, but advocates for the technology have always maintained it can have non-harmful uses, too.

One such advocate is porn company Naughty America. This week, it launched a service that lets customers pay to customize adult clips to their liking using AI. They’ll be able to insert themselves into scenes alongside their favorite actor or actress or edit the background of an existing clip. “We see customization and personalization as the future,” Naughty America’s CEO, Andreas Hronopoulos, told Variety in an interview.


The company demoed the service with a pair of sample clips (link very much not safe for work). One blends the faces of two actresses and another swaps the background of a scene from a bedroom to a beach. It’s not the most advanced use of the technology, but the face-blending is relatively seamless, and it shows how accessible this sort of AI-powered video manipulation has become.

Customers who want to be inserted into a scene will have to send Naughty America a set of photos and videos of themselves, including different facial expressions that help the software accurately replicate their likeness. The company says its legal team will get consent from the actors involved. Simple edits will cost just a few hundred dollars, and longer, more complicated changes will run into the thousands.

There are a number of potential problems with this service. For example, how will Naughty America know that the photos and videos submitted by customers were obtained consensually? They could be taken under false pretenses and submitted to the site by a third party. (The Verge has reached out to Naughty America with questions but has yet to hear back.)

It might also be important for the company to indelibly watermark the resulting videos, so they’re not confused with original pornographic clips. This is something that a number of companies and researchers are looking into, especially as experts worry that AI video editing will be used to create fake videos for the purpose of political manipulation.

But Naughty America is presenting this as a natural use of the technology. “It’s just editing, that’s all it is,” said Hronopoulos of the “deepfake” concept. “People have been editing people’s faces on pictures since the internet started.” He told Fast Company: “Deepfakes don’t hurt people, people using deepfakes hurt people.”