Forensic Category

Child Sexual Abuse Material (CSAM)

Image Analyzer’s Child Sexual Abuse Material (CSAM) module was developed in collaboration with law enforcement agencies. This category detects pornographic content that may be illegal.

In the past, law enforcement agencies relied on checksum database technology to detect known CSAM images. Our artificial intelligence-based CSAM detection module is capable of finding and highlighting previously unseen, and hence unknown, material. This allows investigators to quickly identify recently generated material and potential new victims.
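
For readers unfamiliar with checksum matching, the minimal sketch below illustrates the general idea: compute a hash of every file and compare it against a list of known checksums. The SHA-256 choice, file layout, and helper names are illustrative assumptions only; this is not Image Analyzer's implementation, and production systems rely on curated hash sets maintained by law enforcement.

```python
import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_known_hashes(hash_list: Path) -> set[str]:
    """Load a newline-separated list of known checksums (hypothetical file format)."""
    return {line.strip().lower() for line in hash_list.read_text().splitlines() if line.strip()}


def flag_known_files(evidence_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Return the files whose checksum appears in the known-hash database."""
    return [
        p for p in evidence_dir.rglob("*")
        if p.is_file() and sha256_of_file(p) in known_hashes
    ]
```

The limitation described above follows directly from this design: an exact checksum only matches material that has already been catalogued, which is why AI-based classification is needed to surface previously unseen images.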


Child Sexual Abuse Material (CSAM) has different legal definitions in different countries. At a minimum, CSAM is defined as imagery or video showing a person who is a child engaged in, or depicted as being engaged in, explicit sexual activity.

Image Analyzer’s CSAM category identifies images and videos showing a child engaged in, or depicted as being engaged in, explicit sexual activity.

Contact us now to request a demo and a free trial!