Sunday, Jan 20, 2019 | Last Update : 12:19 PM IST
Experts and engineers are taking part in an industry “hackathon” to collaborate and create new ways to tackle child sexual abuse online.
We can all agree that content that exploits or endangers children is abhorrent and unacceptable. Google has a zero-tolerance approach to child sexual abuse material (CSAM), and we are committed to stopping any attempt to use our platforms to spread this kind of abuse.
So this week, experts and engineers are taking part in an industry “hackathon” where technology companies and NGOs are coming together to collaborate and create new ways to tackle child sexual abuse online. This hackathon marks the latest milestone in our effort to fight this issue through technology, teams and partnerships over two decades.
In 2006, Google joined the Technology Coalition, partnering with other technology companies on technical solutions to tackle the proliferation of images of child exploitation. Since then, we have developed and shared new technologies to help organisations globally root out and stop the sharing of child abuse material.
In 2008, we began using “hashes”, or unique digital fingerprints, to identify, remove and report copies of known images automatically, without humans having to review them again. In addition to receiving hashes from organisations like the Internet Watch Foundation and the National Center for Missing and Exploited Children, we also add hashes of newly discovered content to a shared industry database so that other organisations can collaborate on detecting and removing these images.
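The hash-matching idea above can be sketched in a few lines. This is a minimal illustration, not Google's implementation: the database entries, function names, and the use of SHA-256 are assumptions for demonstration (production systems use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding, rather than a cryptographic hash that only matches byte-identical copies).

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images,
# e.g. hashes received from NCMEC or the Internet Watch Foundation.
# (This entry is the SHA-256 of the sample bytes b"test" below.)
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an uploaded file.

    SHA-256 is used here only for illustration; it matches exact
    copies, whereas real systems use perceptual hashing.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known image, meaning it can
    be removed and reported automatically, without a human having to
    review it again."""
    return fingerprint(image_bytes) in known_hashes

print(check_upload(b"test"))        # matches the known fingerprint -> True
print(check_upload(b"other data"))  # unknown content -> False
```

The key property is that matching happens against fingerprints, so known material is caught at upload time and no reviewer has to look at the image a second time.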
In 2013, we made changes to the Google Search algorithm to further prevent images, videos and links to child abuse material from appearing in our search results. We've implemented this change around the world in 40 languages. We've also launched deterrence campaigns, including a partnership with the Lucy Faithfull Foundation in the UK, to show warning messages in response to search terms associated with child sexual abuse. As a result of these efforts, we have seen a thirteen-fold reduction in the number of child sexual abuse image-related queries in Google Search.
In 2015, we expanded our work on hashes by introducing first-of-its-kind fingerprinting and matching technology for videos on YouTube, to scan and identify uploaded videos that contain known child sexual abuse material. This technology, CSAI Match, is unique in its resistance to manipulation and obfuscation of content, and it dramatically increases the number of violative videos that can be detected compared to previous methods. As with many of the new technologies we develop to tackle this kind of harm, we shared this technology with industry free of charge.
This work has been effective in stopping the spread of known CSAM online over the years. In 2018, we announced new AI technology that steps up the fight against abusers by identifying potential new CSAM for the first time. The new image classifier assists human reviewers sorting through images by prioritising the content most likely to be CSAM for review. It already enables reviewers to find and report almost 100 per cent more CSAM than was possible using hash matching alone, and helps them find CSAM content seven times faster.
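The classifier-assisted triage described above amounts to ordering a review queue by model score so that human reviewers see the most likely matches first. The sketch below is purely illustrative, assuming a hypothetical queue of items with confidence scores; the item names and scores are invented, not Google's actual system.

```python
# Hypothetical review queue: each item carries a classifier score in
# [0, 1] estimating how likely the image is to be new CSAM.
queue = [
    {"id": "img-1", "score": 0.12},
    {"id": "img-2", "score": 0.97},
    {"id": "img-3", "score": 0.55},
]

def prioritise(items):
    """Order the queue so reviewers work through the most likely
    matches first, instead of scanning images in arrival order."""
    return sorted(items, key=lambda item: item["score"], reverse=True)

for item in prioritise(queue):
    print(item["id"], item["score"])
# Reviewers see img-2 first, then img-3, then img-1.
```

Because reviewers reach true positives sooner, more material is found per reviewer-hour, which is the "seven times faster" effect the post describes.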
Since we made the new technology available for free via the Content Safety API in September, more than 200 organisations have requested access to it to support their work to protect children. Identifying and removing new images more quickly—often before they have even been viewed—means children who are being sexually abused today are more likely to be identified and protected from further abuse. It also reduces the toll on reviewers by requiring fewer people to be exposed to CSAM content.
Because this kind of abuse can manifest through text as well as images, we recently made substantial changes to tackle predatory behaviour in YouTube comments, using a classifier that surfaces inappropriate sexual or predatory comments on videos featuring minors for review. This has led to a significant reduction in violative comments this year.
Underpinning all of this work is deep collaboration with partners. As well as the Technology Coalition, Google is a member of the Internet Watch Foundation and the WePROTECT Global Alliance, and we report any CSAM we find to the National Center for Missing and Exploited Children, which in turn reports to law enforcement.