
Santa Barbara Prosecutes First Case Under AI-CSAM Law

The Santa Barbara Independent

In brief

  • Santa Barbara authorities have made history by prosecuting Dayton Aldrich under California's new law targeting AI-generated child sexual abuse material (CSAM).
  • The case marks the first on the Central Coast to use Assembly Bill 1831, which criminalizes such content.
  • Prompted by deepfake technology and "nudify" apps capable of creating realistic images, the law closes a legal gap by treating AI-generated CSAM like real abuse material.
  • The National Center for Missing and Exploited Children (NCMEC) has seen a surge in reports of AI-CSAM, jumping from 4,700 in 2023 to 1.5 million in 2025.
  • Investigators linked Aldrich to explicit chats on Kik, where he expressed an "unusual interest" in minors.
  • They found multiple CSAM images, including those of a former child actress and a TikTok personality.
  • Aldrich, once a victim program assistant, faced severe penalties but pleaded guilty to one charge, receiving a year in jail and two years' probation.
  • His arrest also revealed his possession of over 20 guns, highlighting the broader societal risks.
  • This case underscores the urgent need to combat AI-CSAM and protect vulnerable youth.

Terms in this brief

CSAM
Child Sexual Abuse Material — images or videos depicting the sexual abuse or exploitation of children. The term describes illegal content that authorities work to identify and remove from the internet.
Assembly Bill 1831
A California law that makes it illegal to create, distribute, or possess AI-generated child sexual abuse material. It treats such content similarly to real abuse materials, aiming to protect children from exploitation.

Read full story at The Santa Barbara Independent
