Today Google is announcing the release of a new tool, which it is making available at no cost to relevant NGOs and industry partners. Here’s the problem it addresses.
Microsoft’s PhotoDNA is brilliant. It works at scale to find images that have already been identified as CSAM and entered into a database, which is then used to scan repositories of images for matches. Matches can be rapidly deleted while steps are taken to locate the victim and bring the perpetrators to book. PhotoDNA will continue to be an essential element in the ongoing fight to rid the internet of these images and help the child victims.
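The matching workflow described above can be sketched roughly as follows. This is only an illustration of the idea: PhotoDNA uses a robust perceptual hash designed to survive resizing and re-encoding, whereas this sketch substitutes an ordinary cryptographic hash, and the function and variable names are invented for the example.

```python
import hashlib

# Database of hashes of previously identified images. In a real deployment
# this would be loaded from a curated CSAM hash list, not built by hand.
known_hashes = set()

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash of the image bytes.

    PhotoDNA's actual hash is perceptual (tolerant of small edits);
    SHA-256 is used here purely to show the matching workflow.
    """
    return hashlib.sha256(data).hexdigest()

def scan_repository(images: dict[str, bytes]) -> list[str]:
    """Return the names of images whose hash matches the known database."""
    return [name for name, data in images.items()
            if image_hash(data) in known_hashes]
```

The key property is that matching is a cheap set-membership test, which is what lets the approach run at the scale of an entire image repository.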
But what about those images that have not yet been found and classified? Any new image that is located is likely to be linked to current or recent sexual abuse, abuse that may still be ongoing. This is where Google’s new Content Safety API can step in. It deploys AI to sift through, identify and classify images that are most likely to contain child sexual abuse material. From the millions of images residing on a server, it can flag and prioritize those that need to be looked at first. This is a huge step forward which will complement PhotoDNA and greatly enhance the global effort to make the internet a better and safer place for children.
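The triage idea, scoring images with a classifier and surfacing the highest-risk ones for human review first, could be sketched like this. The real Content Safety API is a Google-hosted service; the `classifier`, `prioritize`, and threshold here are hypothetical stand-ins used only to illustrate the prioritization step.

```python
from typing import Callable

def prioritize(images: dict[str, bytes],
               classifier: Callable[[bytes], float],
               threshold: float = 0.5) -> list[tuple[str, float]]:
    """Score every image and return the flagged ones, highest risk first.

    `classifier` stands in for the AI model: it maps image bytes to a
    risk score in [0, 1]. Images at or above `threshold` are flagged
    and sorted so reviewers see the most urgent cases first.
    """
    scored = [(name, classifier(data)) for name, data in images.items()]
    flagged = [(name, score) for name, score in scored if score >= threshold]
    return sorted(flagged, key=lambda item: item[1], reverse=True)
```

The point of this design is that the model does not replace human reviewers; it orders the queue so that limited reviewer time goes to the images most likely to reflect ongoing abuse.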
Well done Google. Three gold stars.