AI Sextortion Scam Triggers Tragic Teen Suicide in Kentucky

A 16-year-old boy in Kentucky took his own life after falling victim to an AI-generated sextortion scheme. Elijah Heacock of Glasgow received a message containing a fake nude image created with AI and a demand for $3,000 to keep it from being sent to family and friends. Hours later, on February 28, Elijah died by suicide.

Elijah’s parents, John Burnett and Shannon Heacock, say they had no idea what their son was facing until they discovered the extortion messages on his phone. “The people that are after our children are well organized,” Burnett said. “They are well financed, and they are relentless. They don’t need the photos to be real—they can generate whatever they want.”

Sextortion, a growing threat, involves blackmailing minors by threatening to release sexual images—real or fake—unless victims provide more explicit material, engage in sexual activity, or pay money. According to the National Center for Missing and Exploited Children (NCMEC), over 500,000 cases were reported in the past year alone, and at least 20 young people have died by suicide in connection with such schemes since 2021.

The rise of AI technology has made these scams even more dangerous, with over 100,000 sextortion reports this year alone involving AI-generated images.

In response, President Donald Trump signed the Take It Down Act into law on May 19, making it a federal crime to post or distribute sexually explicit material—real or AI-generated—without consent. The law also requires social media platforms to remove such content within 48 hours of a victim’s request.

Elijah’s family hopes the new law will help prevent similar tragedies, but they know it’s just one step in a much larger battle. “No war is ever won by one bullet,” Burnett said. “You got to win battles. You got to win fights. And we’re in it.”