Child Sexual Abuse Material (CSAM)

Child Sexual Abuse Material (CSAM) is illegal content. Please note that this page does not constitute legal advice.

CSAM is a sensitive topic that may provoke a strong emotional response from readers. This page explains the importance of being proactive about the detection, removal, and reporting of CSAM.

You should make it a priority to prevent your platform from becoming an unwilling host to CSAM. To do that, you need to learn what to look for and how best to moderate it.

What Is Child Sexual Abuse Material (CSAM)?

CSAM is any image or video that depicts a minor being subjected to sexual activity; by definition, it is abuse. Bad actors sometimes use online chat rooms, live video streams, and other online community forums to post this illegal content.

How To Proactively Moderate Child Sexual Abuse Material

There are several things you can do to be more proactive about the detection, removal, reporting, and even prevention of CSAM.

Create a Plan To Identify & Remove CSAM Quickly

Establish a process for content moderators to identify and remove CSAM as quickly as possible. This involves either hiring or designating someone to do content moderation for your platform. Your content moderators will use manual or automated moderation practices to identify CSAM.

To create an effective plan, you'll need to educate and involve multiple departments.

  • Your product team should help you identify the tools and methods you'll use to filter content across your product and identify CSAM.

  • Your legal team or a legal consultant can advise you on what to do if you identify CSAM and what you are legally required to do to remove and report this content.

  • Your communications or PR team should be able to speak to the steps you take to moderate CSAM and should have a communication plan for both internal and external parties in case any CSAM is published on your platform.

Provide CSAM Training to All Content Moderators and Other Stakeholders

Anyone who is responsible for moderating content should be trained on what CSAM is and how to moderate it. They must be familiar with your internal policies and guidelines, as well as any external laws and regulations.

Train every person who has the responsibility of identifying and removing CSAM so they know how to spot it, how to remove it, and what steps to take to report it to the appropriate authorities.

Create step-by-step instructions or a reference guide they can use every day to make sure they always accurately moderate for CSAM. The instructions will depend on the type of tool you use to moderate and will be specific to your platform and how it functions on the back end. 
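As an illustration only, the core of such a reference guide can also be mirrored in code so that every incident follows the same sequence. The step names below are hypothetical and not tied to any particular tool; in the U.S., reports from online service providers typically go to NCMEC's CyberTipline.

```python
from enum import Enum, auto


class CSAMIncidentStep(Enum):
    """Hypothetical checklist mirroring a written moderator reference guide."""

    REMOVE_FROM_PUBLIC_VIEW = auto()  # take the content down immediately
    PRESERVE_EVIDENCE = auto()        # securely retain the content and metadata for investigators
    SUSPEND_UPLOADER = auto()         # stop further uploads from the same account
    REPORT_TO_AUTHORITIES = auto()    # e.g., NCMEC's CyberTipline for U.S. providers
    DOCUMENT_INTERNALLY = auto()      # notify legal and communications teams


def incident_checklist() -> list[CSAMIncidentStep]:
    """Return the steps in the order a moderator should complete them."""
    return list(CSAMIncidentStep)
```

Encoding the checklist this way also makes it easy to log which steps were completed for each incident.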

Implement Automated Moderation 

Implementing automated moderation helps in a few ways. It sorts through and removes content faster than a human can, and it spares a person from needing to see and identify potentially traumatic content.

Reports from The Verge and the BBC indicate that manual content moderation can cause PTSD in human moderators. Moderating harmful and illegal content has to be done, but it doesn't all have to be done by a person.

Automated moderation tools use AI to help you find, monitor, and resolve harmful content with minimal human intervention. With some tools, moderators can set up filters that flag potential CSAM so that this content is automatically detected and prevented from being published.
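As a minimal sketch of how such a filter might work, the example below checks uploads against a list of known-CSAM hashes before publication. The hash set, function names, and exact-match hashing are illustrative assumptions, not the API of any specific product; production systems rely on perceptual hashing (such as Microsoft's PhotoDNA) and vetted hash lists from organizations like NCMEC so that re-encoded or resized copies still match.

```python
import hashlib

# Hypothetical, illustrative hash list of known CSAM signatures.
# In production this would be a vetted, regularly updated list from a
# trusted source, and it would use perceptual rather than exact hashes.
KNOWN_CSAM_HASHES: set[str] = set()


def is_known_csam(image_bytes: bytes) -> bool:
    """Return True if the upload exactly matches a known-CSAM hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES


def review_upload(image_bytes: bytes) -> str:
    """Decide what happens to an upload before it is published."""
    if is_known_csam(image_bytes):
        # Block publication and hand off to the escalation process
        # (removal, evidence preservation, reporting) described above.
        return "blocked_and_escalated"
    return "published"
```

The key design point is that the check runs before content is published, so a human moderator only gets involved after the automated filter has already blocked the upload.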

CSAM in the News

Major tech players have faced their own issues with CSAM.

In 2021, Apple publicly announced that it would scan all iCloud photos for CSAM. The plan drew widespread criticism over concerns about the security and privacy of all users, and Apple eventually abandoned it. Critics were worried about the surveillance capabilities of the technology and how it could potentially be abused.

While the company didn't move forward with its original plan, Apple still took steps to prevent the distribution of CSAM by putting moderation tools into the hands of parents and caregivers. Parents and caregivers can now enable warnings and resources that are shown "to children if they receive or attempt to send photos that contain nudity."

Google recently attempted a plan similar to Apple's, automatically monitoring photos and videos uploaded to its platform. However, Google has marked parents' photos of their own children (in the bath, for example) as CSAM, and those parents then lost access to their entire Google accounts.

Take CSAM Moderation Seriously — Be Proactive

CSAM is a problem you should consider seriously because it has severe consequences for you and your users. Put a concrete plan in place that anyone involved with moderation on your platform can reference. Make sure everyone is trained on CSAM moderation, including how to use any automated moderation tools your organization employs.

Next Steps

Start by opening an account and trying out our products. We're here to help you find the best solution for your use case. Contact us any time to learn more about Stream.

Chat Messaging

Build any kind of chat messaging experience without scalability or reliability issues.

Learn more about Chat Messaging

Activity Feeds

Build any kind of feed without worrying about the scalability or reliability of your feeds.

Learn more about Activity Feeds