While the late Blaze Foley can't protest from beyond the grave, his musical legacy is being hijacked by AI imposters on Spotify. A fake song called "Together" recently appeared on the country legend's official page, complete with AI-generated artwork showing someone who looks nothing like Foley. The style? All wrong. The voice? Not even close. Just another digital ghost haunting the streaming platforms we trust.
Foley wasn't the only victim. Guy Clark and other deceased country artists had their pages corrupted by similar AI fakery. Record labels and estates are furious—and they should be. They're calling it what it is: fraud, plain and simple.
Spotify eventually yanked the counterfeit tracks, but only after fans and labels raised hell. Labels pointed fingers at SoundOn, TikTok's music distribution service, for allowing the uploads; Spotify cited its "Deceptive Content" policy when removing the songs. Too little, too late? Probably. The verification systems clearly have holes big enough to drive a tour bus through.
The technical details remain murky. The hijackers likely exploited weaknesses in distribution platforms' verification checks to plant AI-generated garbage on legitimate artist pages. A mysterious entity called "Syntax Error" appears as the copyright owner on many of these fakes. A fitting name for a glitch in the music industry's matrix. And as generative tools improve, fakes like these get cheaper to churn out and harder to spot.
For artists' estates, this isn't just annoying; it's devastating. Label owners describe these AI impersonations as "schlock" and "algorithmic fraud" that tarnish carefully preserved legacies. Craig McDonald, owner of Foley's record label, said the track completely misrepresented the artist's style. Impersonating an artist without the consent of the artist or the estate raises obvious ethical problems: fans get confused, history gets rewritten, all because some algorithm tried playing songwriter.
The legal landscape remains frustratingly vague. No law specifically prohibits uploading AI-generated music under a dead artist's name. It's ethically repugnant, sure, but the regulatory framework hasn't caught up.
Meanwhile, Blaze Foley's authentic voice gets a little harder to hear through the digital noise. The machines are singing his songs now. And nobody asked permission.