A 17‑year‑old from New Jersey has filed a lawsuit against the company behind the AI tool ClothOff, alleging that the app was used to generate non‑consensual nude images of her when she was 14. The tool's operator, AI/Robotics Venture Strategy 3 Ltd., is being sued alongside the messaging platform Telegram, which allegedly facilitated access to the software.
According to the complaint, a social‑media photo of the student in a swimsuit was altered with ClothOff to produce a realistic nude image in which her face remained recognizable. The image was then circulated among classmates. The lawsuit contends that because the victim was a minor at the time, the altered image qualifies as child sexual abuse material under both state and federal law. The filing requests a court order compelling the defendants to delete existing non‑consensual images, cease operating the tool, and refrain from using such images to train future AI models.
The case reflects a wider pattern of AI misuse in intimate digital contexts. Commentators note that high‑school students increasingly exploit AI “undressing” apps to fabricate nude images of peers, raising urgent concerns about adolescent privacy, consent, and emotional harm.
Legally, the case highlights the difficulty of holding overseas or loosely regulated AI‑tool creators accountable: the company behind ClothOff is reportedly based in the British Virgin Islands and operated from Belarus, complicating jurisdiction and enforcement.