According to a report by TechCrunch and online safety firm AntiToxin Technologies, Microsoft’s Bing search engine surfaces images of child sexual abuse, and even suggests additional keywords in some cases.

AntiToxin conducted its research between December 30 and January 7 with SafeSearch turned off. During that time, the researchers found that searches for certain terms related to child sexual abuse returned images depicting this form of abuse, and surfaced additional related terms – both alongside the results and as autocomplete suggestions in the search bar.

Microsoft corporate vice president of Bing and AI products, Jordi Ribas, told the publication that these results were unacceptable and that the company has now fixed the problem:

We acted immediately to remove them, but we also want to prevent any other similar violations in the future. We’re focused on learning from this so we can make any other improvements needed.

While Microsoft claims it has fixed the issue and is working on blocking more queries related to images of child abuse, AntiToxin researchers found that some queries from the report still returned results. The company noted that it’s working on adding terms like “child sexual abuse” to its content flagging categories.

AntiToxin CEO Zohar Levkovitz told TechCrunch that tech companies should double down on their efforts to prevent such content from spreading on their platforms.

Sadly, this is not an isolated incident of tech companies failing to contain the problem of searches related to images of child abuse. Earlier this week, a report suggested that bad actors were spreading abusive images through Dropbox links using specific Instagram hashtags.

It’s odd to see Microsoft dropping the ball on this front, particularly since the company has led numerous efforts to tackle the spread of images depicting child sexual abuse on the web. That said, it looks like the firm needs to work harder to fix Bing, which was found surfacing offensive material as recently as three months ago.

Hopefully, this finding will remind tech firms to focus on cleaning up their act and improving the hygiene of their platforms in 2019.

You can read AntiToxin’s full report here.

Read next: Don’t overestimate AI’s understanding of human language