The city of West Sacramento, California, is using a controversial artificial intelligence program that tracks and analyzes what people are posting about the city online, KOVR-TV reported.

City leaders promoted the practice as a way to better understand residents' concerns and what is “trending.” But some people are questioning the scope of the program, called Zencity, and how the data might be used.

How does it work?

Zencity is a system that crawls through public posts on social media sites like Facebook, Twitter, and Instagram. Then, the system sorts through the posts and classifies the information as either positive or negative.
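The report doesn't describe Zencity's internals, which are proprietary. As a rough illustration of the general idea, sorting public posts into positive and negative buckets can be sketched with a naive keyword-based classifier; the word lists and `classify` function below are hypothetical examples, not Zencity's actual method:

```python
# Illustrative sketch only: a naive keyword-based sentiment classifier.
# Zencity's real models are proprietary; these word lists and classify()
# are hypothetical, shown just to make the positive/negative sorting concrete.

POSITIVE_WORDS = {"great", "love", "thanks", "improved", "safe"}
NEGATIVE_WORDS = {"theft", "stolen", "complaint", "unsafe", "closed"}

def classify(post: str) -> str:
    """Label a public post as 'positive', 'negative', or 'neutral'."""
    words = set(post.lower().split())
    pos = len(words & POSITIVE_WORDS)
    neg = len(words & NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

posts = [
    "Another mailbox theft on my street last night",
    "Love the new bike lanes downtown, thanks city hall",
]
for p in posts:
    print(classify(p))  # prints "negative", then "positive"
```

Production systems typically use trained language models rather than keyword lists, but the output is the same kind of per-post label that can then be aggregated to show what topics are "trending" and how residents feel about them.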

The company's website explains:

ZenCity is a platform for understanding people in the city on a wide scale. With the use of advanced AI algorithms, we collect and analyze hundreds of thousands of interactions from social media, city hotlines and many other sources, and provide different stakeholders with detailed insights about how their citizens view and use the city — in real time.

Mayor Christopher Cabaldon told the TV station that the city began using the program following a rash of mailbox thefts last year.

“We saw the thing that most people were talking about were mailbox thefts,” Cabaldon said. “That’s something that we might not have noticed just by waiting for people to come to city hall or filing a complaint.”

“The purpose of Zencity is to see the big picture,” he added.

Other hot topics in the city have included the closing of a Safeway store and a father’s alleged murder of his two daughters in January. In March, residents complained about the lack of updates on a shooting threat at River City High School, the report states.

“It’s not that Zencity replaces our other forms of civic engagement, it’s just a way to listen more,” Cabaldon told the TV station.

What about privacy concerns?

Privacy advocates are suspicious, especially in light of the recent Cambridge Analytica scandal, in which data was harvested from tens of millions of Facebook users.

“There are ways this could go wrong,” said Peter Eckersley of the Electronic Frontier Foundation, a privacy and digital rights nonprofit group. “Once you get into policing there are many more potential concerns around the use of artificial intelligence.”

Cabaldon said there are no privacy issues because only publicly available data is used.

“It allows us to hear the whole community and not just the loudest voices that come to our chambers for a public hearing,” Cabaldon told KOVR.

The city paid $12,000 to license the program for one year, a price that reflects a 66 percent early-user discount, according to the report.