Artificial intelligence can go beyond recognizing a face; the latest technology also tries to read facial expressions to assess what the subject is feeling. Such systems are relatively early in their development—and prone to serious errors—but the idea of someday seeing them in widespread use still worried computer programmer and filmmaker Noah Levenson. “We’re giving all these corporations unrestricted access to our faces all the time,” he says. “We’re using Snapchat to put these filters on our faces. We’re taking selfies and posting them on Instagram, and we’re using apps like FaceTime to communicate with each other.”

Levenson decided to base his next film project on this idea. “I thought, given all this access we’ve given them to our faces, what is the most hypothetically awful thing they could possibly be doing to harm us?” he says. He imagined companies potentially analyzing users’ photographs to determine their emotions in each one, then combining this with other data to draw sweeping conclusions about people’s thoughts and preferences. “What if they correlated [the emotional analysis] with other inputs: your location via GPS or maybe the content you’re consuming,” he says. “They would know something about the way you felt about some place, some thing—or maybe even someone, if they examine your text history.” Then the tech giants could turn around and sell these data about your behaviors and opinions to other parties.

So Levenson began researching emotion recognition technology—and quickly realized its use might not be entirely hypothetical. “When I started googling around, I discovered the patents revealing that the tech companies had kind of beat me to it,” he says. “They had clearly been thinking about precisely the same thing for at least like five years now.” He incorporated this research into his new “interactive documentary.”

When a viewer watches Stealing Ur Feelings online, their own face—frequently festooned with virtual sunglasses or other digital stickers—appears onscreen at the same time as stock imagery such as cute animals, pizza and random humans. These visuals speed by as a narrator cheerily discusses how emotion recognition works and how tech companies could use it. To demonstrate what this technology can do, the video applies it to viewers: Levenson adapted open-source code into a program that uses a device’s Web cam to analyze the viewer’s face as it reacts to the other onscreen imagery.

In other words, as you watch this video, it watches you right back. “I think it’s a very interesting way of not just talking about these abilities but actually showcasing them,” says privacy researcher Florian Schaub, an assistant professor at the School of Information at the University of Michigan, who was not involved in Levenson’s new project. Based on the viewer’s reactions, the documentary draws real-time conclusions. These range from the obvious (how much viewers tend to look at their own face and their average range of expressions) to the creepy (how much they like pizza or puppies—and whether they prefer looking at white people or black people) to the really out-there (their estimated income and IQ score).
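The pizza-or-puppies inference described above boils down to a simple correlation: pair each analyzed video frame's expression scores with whatever was on screen at that moment, then average them per stimulus. The sketch below illustrates that idea only; the function name, data shapes and sample numbers are invented for illustration, not taken from Levenson's actual code (browser face-analysis libraries such as face-api.js produce per-frame scores of roughly this shape).

```javascript
// Hypothetical sketch of the emotion-vs-stimulus correlation the documentary
// demonstrates. All names and values here are illustrative assumptions.
// One reading per analyzed video frame: what was on screen, and the viewer's
// "happy" expression score for that frame (0..1).
const readings = [
  { stimulus: "pizza",   happy: 0.82 },
  { stimulus: "pizza",   happy: 0.74 },
  { stimulus: "puppies", happy: 0.95 },
  { stimulus: "puppies", happy: 0.89 },
  { stimulus: "traffic", happy: 0.10 },
];

// Average the happiness score per stimulus -- the core of "how much does
// this viewer like pizza or puppies?"
function averageHappinessByStimulus(frames) {
  const totals = new Map();
  for (const { stimulus, happy } of frames) {
    const t = totals.get(stimulus) ?? { sum: 0, n: 0 };
    t.sum += happy;
    t.n += 1;
    totals.set(stimulus, t);
  }
  const averages = {};
  for (const [stimulus, { sum, n }] of totals) {
    averages[stimulus] = sum / n;
  }
  return averages;
}

const result = averageHappinessByStimulus(readings);
// result.puppies ≈ 0.92, result.traffic ≈ 0.10: the viewer smiles at puppies.
```

Note that nothing here leaves the function's local variables: as with Levenson's browser-only design, discarding the objects discards the data.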

Levenson says he coded his six-minute video to work in a Web browser and avoid saving information, so the data it picks up will disappear when one closes the window. He doubts, however, that tech companies would be so quick to erase their findings. And he is not the only one worried about this possibility. The nonprofit browser developer Mozilla supported Stealing Ur Feelings with a $50,000 “Creative Media Award” after Levenson submitted a description and rough draft of the project. Mozilla is also sending a petition to Snapchat, which has filed an emotion-recognition patent, asking the company to share whether it is already applying this technology to users’ photographs and videos. (At the end of the documentary, viewers can smile on cue in order to add their signature to the petition.)

When reached for comment, a representative of Snap, the company that owns Snapchat, listed some of its privacy principles and policies but declined to share a public response. Yet Schaub says some tech companies’ terms of service do not necessarily reveal whether they are using emotion recognition or not. According to him, privacy policies might mention that a company gathers user data to improve its services—but these explanations rarely provide specific details on the exact information the companies collect and how they monetize it. “If they use it to show you specific ads, that’s maybe one thing that’s a bit creepy,” he says. “But I think the more concerning part is if they allow other people to target you based on your mental state or sell this information to data brokers.” After all, this type of information exchange already exists. “When you go to a Web site or use a mobile app, there’re often 10, 20 different third parties tracking what you’re doing,” Schaub says. “There’s this whole data economy with lots of shadow companies and data brokers you’ve never heard of, don’t have a relationship with, that end up collecting or receiving a lot of data about you.”