Technology

Google Launches AI Solution to Spot Child Sex Abuse Content

Noah_Loverbear / Wikimedia Commons

Among the most heinous of online activities, the spread of child sex abuse content ranks at or near the top. Alphabet Inc.'s (NASDAQ: GOOGL) Google has released an artificial intelligence solution to help combat it. While the tool is not a complete fix, it offers meaningful help against an overwhelming and serious problem.

Nikola Todorovic, engineering lead, and Abhi Chaudhuri, product manager, wrote in a Google Europe blog post:

Using the internet as a means to spread content that sexually exploits children is one of the worst abuses imaginable. That’s why since the early 2000s we’ve been investing in technology, teams, and working closely with expert organizations, like the Internet Watch Foundation, to fight the spread of child sexual abuse material (CSAM) online. There are also many other organizations of all sizes that are deeply committed to this fight—from civil society groups and specialist NGOs [non-government organizations] to other technology companies—and we all work to ensure we share the latest technological advancements.

Today we’re introducing the next step in this fight: cutting-edge artificial intelligence (AI) that significantly advances our existing technologies to dramatically improve how service providers, NGOs, and other technology companies review this content at scale. By using deep neural networks for image processing, we can now assist reviewers sorting through many images by prioritizing the most likely CSAM content for review. While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM. Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse.

Because the system still requires human review, it cannot stamp out child sexual abuse content on its own, but it can certainly help.
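The two-stage approach Google describes, matching against fingerprints of known material while using a classifier to rank new, unconfirmed content for human reviewers, can be sketched roughly as follows. This is an illustrative outline only, not Google's implementation: the `fingerprint`, `triage`, and `score_fn` names are hypothetical, and a plain SHA-256 hash stands in for the perceptual-hash systems (such as PhotoDNA-style fingerprints) real services use.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual hash; real systems use robust image fingerprints."""
    return hashlib.sha256(data).hexdigest()

def triage(images, known_hashes, score_fn):
    """Split images into confirmed hash matches and a classifier-ranked review queue.

    images: iterable of (name, raw_bytes) pairs
    known_hashes: set of fingerprints of previously confirmed content
    score_fn: hypothetical model scoring function (higher = more likely abusive)
    """
    matches, queue = [], []
    for name, data in images:
        if fingerprint(data) in known_hashes:
            matches.append(name)                  # previously confirmed content
        else:
            queue.append((score_fn(data), name))  # new content: rank by model score
    queue.sort(reverse=True)                      # highest-risk images reviewed first
    return matches, [name for _, name in queue]
```

The key point the blog post makes is captured in the second branch: instead of reviewing unmatched images in arbitrary order, reviewers see the classifier's highest-scoring items first, which is what speeds up identification of new material.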

