Google Chairman Eric Schmidt today said his company's search engine has made it far more difficult to find images of child pornography and that Google is developing a technology that will identify children being abused in YouTube videos.

"While no algorithm is perfect—and Google cannot prevent pedophiles adding new images to the Web—these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids," Schmidt wrote in the UK-based Daily Mail.

UK Prime Minister David Cameron recently called on search engines to impose a blacklist of search terms related to child sexual abuse. Microsoft's Bing was "the first to introduce pop-up warnings for people in the UK who seek out online images of child abuse" back in July, a BBC story at the time noted. The UK is also forcing Internet service providers to roll out filters targeting all porn.

The Guardian also reported today that "GCHQ [the British intelligence agency] will be brought in to tackle the problem of child abuse material being shared on peer-to-peer networks."

Google's changes aren't just for the UK. "[W]e will soon roll out these changes in more than 150 languages, so the impact will be truly global," Schmidt wrote. In addition to the 100,000 search queries mentioned above, Google is now showing warnings for more than 13,000 queries. "These alerts make clear that child sexual abuse is illegal and offer advice on where to get help," Schmidt wrote.

Schmidt noted that "Google and Microsoft have been working with law enforcement for years to stop pedophiles sharing illegal pictures on the Web." In this case, Schmidt credited Microsoft for sharing its picture detection technology with Google to help its rival better identify pictures of children being sexually abused.

To prevent false positives, Google has employees reviewing photos before blocking them. "This is because computers can't reliably distinguish between innocent pictures of kids at bathtime and genuine abuse. So we always need to have a person review the images. Once that is done—and we know the pictures are illegal—each image is given a unique digital fingerprint," Schmidt wrote.
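The workflow Schmidt describes—human review first, then a unique fingerprint that lets systems block re-uploads automatically—can be sketched roughly as below. This is an illustration only: the function names are hypothetical, and a plain cryptographic hash is used for simplicity, whereas production systems such as Microsoft's PhotoDNA rely on proprietary perceptual hashing that still matches an image after resizing or re-encoding.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Exact-match fingerprint of an image file.

    Note: any change to the bytes (cropping, re-compression) changes
    this hash. Real detection systems use robust perceptual hashes
    instead; SHA-256 is used here purely to illustrate the workflow.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints are added only after a human reviewer confirms an
# image is illegal, per the process Schmidt describes.
confirmed_fingerprints: set[str] = set()

def confirm_after_human_review(image_bytes: bytes) -> None:
    confirmed_fingerprints.add(fingerprint(image_bytes))

def is_known_match(candidate_bytes: bytes) -> bool:
    # New uploads are checked against the confirmed set automatically,
    # with no need for a second human review of the same image.
    return fingerprint(candidate_bytes) in confirmed_fingerprints
```

Because matching is a set lookup on the fingerprint rather than a comparison of image contents, a confirmed image can be blocked across services that share the fingerprint database without the image itself being redistributed.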

Google also has a plan to target child pornography videos. "[P]edophiles are increasingly filming their crimes. So our engineers at YouTube have created a new technology to identify these videos," he wrote. "We're already testing it at Google, and in the new year we hope to make it available to other Internet companies and child safety organisations."