Threat Category

Child Sexual Abuse Material (CSAM)

Produced in conjunction with law enforcement, this category detects
pornographic content that may be illegal.

While law enforcement has historically used checksum database technology to detect known Child Sexual Abuse Material (CSAM)
images, this new AI-based CSAM detection module can find and flag material that is previously unseen and therefore
unknown to law enforcement. This allows investigators to quickly identify recently generated material and potential new victims.


Image Analyzer’s CSAM category is designed to identify images and videos containing child sexual abuse material.