Unlike the conventional approach, which simply matches image hashes against a database of known offending images, the AI method can also flag previously undiscovered material. That, in turn, could help authorities catch active offenders and prevent further abuse.
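To illustrate the limitation of the conventional approach, here is a minimal sketch of exact-match hashing. The hash values and function name are invented for illustration; real systems use dedicated perceptual-hash databases rather than plain SHA-256, but the core weakness is the same: an image can only be flagged if its fingerprint is already on file.

```python
import hashlib

# Hypothetical known-hash database; in practice these would be
# fingerprints of previously identified images. The placeholder
# value below is simply the SHA-256 of an empty byte string.
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True only if this exact file has been seen before."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# An exact copy of a catalogued file is caught, but any new,
# previously unseen image slips through undetected.
print(is_known_image(b""))                 # matches the placeholder hash
print(is_known_image(b"new image bytes"))  # unseen material is missed
```

This is what an AI classifier is meant to get past: instead of requiring a byte-for-byte (or fingerprint-level) match, it scores image content directly, so never-before-seen material can still be surfaced for human review.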

The tool is free to both corporate partners and non-governmental organizations through Google's Content Safety API toolkit. While there's no certainty that it'll dramatically reduce the volume of horrible images online, it could help organizations detect and report child sexual abuse even when they have only limited resources.