
Teens sue Elon Musk’s xAI over Grok’s AI-generated CSAM

The Verge

Technology Correspondent

Image source: The Verge

Why it matters

The plaintiffs include two minors and an adult who was underage when the events in the lawsuit took place.

Key takeaways

  • One of the victims, identified as “Jane Doe 1,” alleges that last December she learned that explicit, AI-generated images of herself and at least 18 other minors were available on Discord.
  • The perpetrator, who has since been arrested, allegedly traded Jane Doe 1’s AI-generated CSAM in Telegram group chats with hundreds of other users and, the lawsuit claims, generated the explicit images of all three victims using Grok.
  • The lawsuit alleges that xAI “failed to test the safety of the features it developed” and that Grok is “defective in design.”

The plaintiffs include two minors and an adult who was underage when the events in the lawsuit took place. One of the victims, identified as “Jane Doe 1,” alleges that last December, she learned that explicit, AI-generated images of herself and at least 18 other minors were available on Discord. “At least five of these files, one video and four images, depicted her actual face and body in settings with which she was familiar, but morphed into sexually explicit poses,” the lawsuit claims.

The perpetrator, who has since been arrested, allegedly used Jane Doe 1’s AI-generated CSAM “as a bartering tool in Telegram group chats with hundreds of other users, trading her CSAM files for sexually explicit content of other minors.” The lawsuit claims the perpetrator generated the explicit images of Jane Doe 1 and the two other victims using Grok. It also alleges that xAI “failed to test the safety of the features it developed” and that Grok is “defective in design.”

Though X has tried to make it harder for users to edit images with Grok, The Verge has found that it’s still possible to manipulate images uploaded to the platform. X has maintained that “anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.” X didn’t immediately respond to The Verge’s request for comment.

“These are children whose school photographs and family pictures were turned into child sexual abuse material by a billion-dollar company’s AI tool and then traded among predators,” one of the victims’ lawyers, Annika K. Martin of Lieff Cabraser, said in a statement. “We intend to hold xAI accountable for every child they harmed in this way.”

The lawsuit seeks damages for victims impacted by Grok’s “illegal images.” It also asks the court to prevent xAI from generating and spreading alleged AI-generated CSAM.


    Curated by James Chen

    Publisher: The Verge

    Published: Mar 16, 2026

    Read time: 2 min

    Category: Technology