Google AI Ruins Concert After Falsely Branding Musician a Sex Offender

A concert venue in Canada canceled a performance by fiddler Ashley MacIsaac after a Google AI summary wrongly labeled him a sex offender, prompting the musician to consider a defamation lawsuit.

MacIsaac, known for blending Celtic music with modern sounds, was set to perform at the Sipekne’katik First Nation in Nova Scotia on December 19. However, the venue canceled the show after false information generated by Google AI surfaced online, according to NewsNation.

MacIsaac told the CBC that he was informed the erroneous Google AI summary identified him as a sex offender, leading the venue to pull the performance. He said the AI’s mistake stemmed from aggregating articles about another Canadian man with the same name.

“You are being put into a less secure situation because of a media company — that’s what defamation is,” MacIsaac said. He added that if a lawyer were willing to take his case pro bono, he would pursue legal action, saying, “I would stand up… because I’m not the first and I’m sure I won’t be the last.”

Speaking to the St. Albert Gazette, the 50‑year‑old musician described the potential consequences of the error. “It’s a very scary situation where, if I had gone to a border, I probably would have been still in jail,” he said. “I’ve been a public figure for years and there are stories written about me that are about marijuana and about me being gay, all that stuff… But when it comes to the serious nature of criminal offenses, it’s completely false. It’s completely wrong.”

The venue apologized for the error. “We deeply regret the harm this caused to your reputation and livelihood,” it said in a statement, calling for “reconciliation” and praising MacIsaac’s artistry and cultural contributions.

A Google spokesperson told CTV that search and AI overviews are “dynamic and frequently changing to show the most helpful information.” The spokesperson added that when errors occur — such as misinterpretations or missing context — the company uses those examples to refine its systems and may take action under its policies.