Child Rescue Coalition Partners with Pex to Enable Platforms to Monitor and Remove CSAM on Social Media

Apr 28, 2021

What would it take to stop the spread of child sexual abuse material (CSAM)? This is a question the Child Rescue Coalition (CRC) has been working to resolve for eight years. As a non-profit founded to protect children from sexual exploitation and abuse, the CRC built technology capable of hunting child predators and has aided in the arrest of more than 12,000 criminals, rescuing over 2,700 abused children. 

By working together, nonprofits like the CRC, law enforcement, and digital platforms can permanently block CSAM from being published and shared on social media.

After testing Pex’s Attribution Engine technology, the CRC determined the Attribution Engine is extremely effective at identifying CSAM within user-generated content. 

Pex, the market leader in digital rights technology, has spent the past seven years building superior technology that can identify more content matches than any other solution. This technology is commonly utilized to find instances of copyright infringement online to drive proper attribution or payment to the rights holders. Fortunately, this technology can also be leveraged for societal good, such as identifying instances of CSAM or other toxic content within user-generated content (UGC) uploaded to social media sites. 

The partnership makes Pex's best-in-class technology available to law enforcement, with a focus on preventing the publication and rapid spread of CSAM in UGC. The CRC tested Pex's technology to determine whether it would be effective at identifying CSAM, including altered versions of videos, which often slip through the cracks, contribute to the proliferation of toxic content, and tie up law enforcement's resources.

CRC’s Testing Method

To test the Attribution Engine, the CRC edited a sample of UGC videos, and Pex was responsible for identifying the altered copies as matches to the originals.
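Conceptually, the matching step being tested can be sketched with a toy perceptual hash. This is an illustration only, not Pex's actual algorithm; every name and number here is invented. The point it demonstrates is that a well-chosen fingerprint stays stable under simple edits, so an altered copy of a video still matches the original.

```python
# Toy perceptual "average hash": one bit per pixel, set if the pixel
# is brighter than the frame's mean. Uniform edits such as a global
# brightness shift leave the bits unchanged, so the copy still matches.

def average_hash(frame):
    """Return a bit list: 1 where a pixel exceeds the frame's mean."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def is_match(h1, h2, threshold=2):
    """Small Hamming distance -> treat as the same content."""
    return hamming(h1, h2) <= threshold

# A 4x4 grayscale "frame" standing in for a video frame.
original = [
    [10, 20, 200, 210],
    [15, 25, 205, 215],
    [12, 22, 202, 212],
    [18, 28, 208, 218],
]

# A simple edit: brighten every pixel by 30. The mean shifts by the
# same amount, so every bit of the hash is unchanged.
brightened = [[p + 30 for p in row] for row in original]

assert is_match(average_hash(original), average_hash(brightened))
```

A real system uses far more robust fingerprints (surviving crops, re-encodes, and speed changes), but the evaluation loop is the same shape: edit, fingerprint, and check that the match still holds.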

“As internet use and user-generated content has proliferated worldwide, the circulation of child sexual abuse material – sometimes inaccurately referred to as ‘child pornography’ – has grown along with it,” said Glen Pounder, the Chief Operating Officer at the CRC. “In evaluating Pex’s technology, it was clear that it is significantly faster and more accurate than what we have utilized in the past. Time is of the essence to help these children.”

What would it take to stop the spread of child sexual abuse material (CSAM)? 

When it comes to stopping the spread of CSAM, digital platforms and law enforcement need technology that can identify all versions of a video, despite distortions, and do it in seconds, prior to the content being published online. Every second wasted could mean more sharing and spreading of this toxic content.

Pex can identify a positive match down to one second of content, and identifying toxic content in UGC takes only five seconds using Attribution Engine. Using a software development kit, platforms can “plug in” to Attribution Engine and pass all user uploads through the engine to identify any use of copyrighted material, CSAM, or other types of toxic content. Law enforcement can also utilize Attribution Engine to securely provide fingerprints of CSAM so it can be identified on platforms. 
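A platform-side integration along these lines might look like the following sketch. All names here (`FingerprintIndex`, `fingerprint`, `check_upload`) are invented for illustration and are not Pex's actual SDK. The stand-in fingerprint is a cryptographic hash, which a real deployment would replace with a robust perceptual fingerprint, since a cryptographic hash changes completely under any edit.

```python
import hashlib

class FingerprintIndex:
    """Stand-in for a service-side index of known-bad fingerprints."""

    def __init__(self, known_fingerprints):
        self.known = set(known_fingerprints)

    def lookup(self, fp):
        """Return True if the fingerprint matches known content."""
        return fp in self.known

def fingerprint(video_bytes):
    # Stand-in only: a cryptographic hash is NOT a perceptual
    # fingerprint (any edit changes it entirely). A real system
    # would compute a distortion-tolerant fingerprint here.
    return hashlib.sha256(video_bytes).hexdigest()

def check_upload(video_bytes, index):
    """Gate called before publication: block on a fingerprint match."""
    if index.lookup(fingerprint(video_bytes)):
        return "blocked"    # withhold publication, report per policy
    return "published"

# Usage: the index is populated from fingerprints supplied securely
# by law enforcement; every user upload passes through the gate.
index = FingerprintIndex({fingerprint(b"known-flagged-sample")})
print(check_upload(b"harmless-video", index))        # published
print(check_upload(b"known-flagged-sample", index))  # blocked
```

The key design point the article describes is the ordering: the check runs before publication, so a match is blocked rather than taken down after it has already spread.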

Learn more  

Read more about the CRC test and the proliferation of CSAM online in Pex's whitepaper: Stopping CSAM in real time

Want to partner with Pex to help prevent CSAM?

Reach out on our contact page
