AI is reshaping almost every facet of our digital lives, and the music industry is no exception. For some artists, though, this technological shift has brought an unwanted problem: AI-generated fake albums hijacking their Spotify pages and potentially siphoning off their streaming revenue. As more musicians find their names attached to music that sounds nothing like their own, a growing scam is coming into focus on streaming platforms like Spotify. These fraudulent releases, often generated by artificial intelligence, exploit the platform’s content-submission system, creating chaos for real artists and fans alike.
Fake Albums: The New AI Plague
It all began innocuously enough: an album by the industrial rock band HEALTH popped up on a Spotify playlist, but something was off. The cover art seemed odd, and the music didn’t sound like HEALTH at all. This was no ordinary glitch; the album was a fake, uploaded under the band’s name without their permission. In the days that followed, other artists began noticing the same thing: their names were being attached to bizarre, often unlistenable AI-generated albums.
The situation escalated when a new fake album appeared on Annie’s artist page. At first, it seemed plausible. Annie had recently released a single, and the album could have been a continuation of that work. But when listeners clicked on it, they were greeted with something entirely different: ambient birdsong and aimless, new-age instrumental music, a far cry from Annie’s established sound. For the band, it was a shock. “That was upsetting to me, because if you have ears, you can definitely hear it’s not our music,” Annie’s bandmate later explained.
Soon, the phenomenon spread to many other artists, particularly those with single-word names like Swans, Gong, and Standards. Fake albums, often created with AI, began popping up on their Spotify pages, sometimes disappearing after a few days and sometimes lingering indefinitely, even after the artists contacted the platform. For many musicians, these unauthorized releases represented not only a violation of their artistic identity but also a threat to their revenue streams.
The Mechanics of the Scam: How It Works
Understanding how these fake albums make their way onto Spotify is crucial to grasping the scope of the problem. Unlike social media platforms where users upload content directly, artists on Spotify usually work with a distributor—a third-party company that helps get their music onto streaming platforms. The distributor handles important tasks like licensing, submitting metadata (such as song titles and artist names), and ensuring artists get paid their due royalties.
However, this pipeline, convenient as it is, operates on an honor system. When a distributor uploads music to Spotify, the platform takes the metadata at face value. So if a fraudulent party submits AI-generated music under the name of an existing artist like “Standards,” the release can land directly on the real band’s page, and Spotify may not catch it immediately. The streaming service assumes the distributor’s information is legitimate and leaves the fake album visible on the artist’s page until someone flags it.
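To make that failure mode concrete, here is a minimal sketch of name-based release matching in Python. The data model, field names, and album titles are all hypothetical; this illustrates the trust problem, not Spotify’s actual ingestion system.

```python
# Hypothetical sketch of honor-system ingestion: releases are attached
# to artist pages by name alone, with no check that the distributor
# actually represents the artist. Not Spotify's real API.
from dataclasses import dataclass

@dataclass
class Release:
    artist_name: str    # supplied by the distributor, taken at face value
    album_title: str
    distributor: str

artist_pages: dict[str, list[Release]] = {"Standards": []}  # the real band's page

def ingest(release: Release) -> None:
    """Attach a release to whichever page matches the submitted name."""
    artist_pages.setdefault(release.artist_name, []).append(release)

# A legitimate upload and a fraudulent one look identical to this
# pipeline: both simply claim the artist's name in their metadata.
ingest(Release("Standards", "Real New Album", "TrustedDistro"))
ingest(Release("Standards", "AI-Generated Knockoff", "ShadyDistro"))

for r in artist_pages["Standards"]:
    print(f"{r.album_title!r} via {r.distributor}")
# Both releases now sit on the same artist page.
```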
While many of these fake releases are removed after being reported, the process often takes longer than expected. Marcos Mena, lead guitarist for Standards, described how a fake album was posted on his band’s page, and despite contacting Spotify, it stayed there for weeks before being taken down. “It’s definitely a bummer because we did have a new album come out this year, and I feel like it’s detracting from that,” Mena shared.
Motivation Behind the Fraud: The Money Trail
So, why would anyone go to the trouble of creating and uploading fake music under someone else’s name? The simple answer is money. Streaming platforms like Spotify pay only fractions of a cent per stream, but when hundreds or even thousands of fake tracks are spread across various artists’ pages, those tiny amounts add up quickly. The real damage occurs when fraudulent tracks are played by bots or unsuspecting listeners and the royalties are funneled to scammers rather than to the legitimate artists.
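Some back-of-the-envelope arithmetic shows why the economics work. The per-stream figure below is an assumed ballpark (royalties are commonly estimated at a few tenths of a cent per play), and the volume numbers are purely illustrative:

```python
# Illustrative fraud economics. All figures are assumptions, not
# measured data: the royalty is a commonly cited ballpark, and the
# volumes describe a hypothetical bot-driven operation.
per_stream_royalty = 0.003           # dollars per stream (assumed)
fake_tracks = 500                    # tracks spread across hijacked pages
streams_per_track_per_day = 1_000    # bot or accidental plays per track

daily = fake_tracks * streams_per_track_per_day * per_stream_royalty
print(f"~${daily:,.0f} per day, ~${daily * 365:,.0f} per year")
# ~$1,500 per day, ~$547,500 per year: negligible per stream, lucrative in bulk
```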
For instance, the fake albums uploaded under Standards’ name were connected to a label called Gupta Music, which had hundreds of similarly suspicious albums on Spotify. These albums, with their generic cover art and AI-generated sounds, appeared to be part of a coordinated effort to exploit Spotify’s royalty system. When listeners unknowingly stream these tracks, the money goes to the scammers instead of the actual creators.
In some cases, fraudsters target musicians whose names are easy to impersonate, or whose fans might not be familiar with their entire discography, making it easier for the fake albums to slip under the radar. The scam is also aided by the fact that many distributors, the companies responsible for uploading tracks to Spotify, have business models built on taking a cut of royalties. These distributors may not be actively involved in the fraud, but they still benefit from the streams the fake music generates.
Spotify’s Response: Slow and Steady
Spotify, for its part, claims to take fraud prevention seriously. According to Chris Macowski, Spotify’s head of music communication, the platform invests heavily in both automated and manual reviews to catch such issues. When asked about the recent wave of AI-generated fake albums, Macowski confirmed that the company had cut ties with the distributor responsible for the fraudulent content, Ameritz Music. “Due to significant and repeated violations of Spotify’s Metadata Style Guide, we ended our relationship with the licensor that provided the content in question,” Macowski said.
Still, the fact remains that the system for identifying fraudulent uploads is imperfect. “The content validation system without any input on the artist level is fairly crazy,” said Glenn McDonald, a former Spotify employee. He pointed out that the system could be improved by flagging albums whose metadata differs from an artist’s usual distributor or label. But until such mechanisms are in place, musicians are left to fend for themselves, hoping the fake albums will eventually be taken down.
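McDonald’s suggestion maps to a simple heuristic. Here is a minimal sketch, with hypothetical field names, of what such a distributor-mismatch check could look like:

```python
# Sketch of the distributor-mismatch heuristic McDonald describes:
# flag a new release whose distributor AND label are both unseen in
# the artist's existing catalog. Field names are hypothetical.
def flag_for_review(new_release: dict, catalog: list[dict]) -> bool:
    """Return True if a release should be held for manual review."""
    if not catalog:
        return False  # a brand-new artist has no history to compare
    known_distributors = {r["distributor"] for r in catalog}
    known_labels = {r["label"] for r in catalog}
    return (new_release["distributor"] not in known_distributors
            and new_release["label"] not in known_labels)

catalog = [{"title": "Real Album", "distributor": "TrustedDistro",
            "label": "IndieLabel"}]
fake = {"title": "Odd New Album", "distributor": "UnknownDistro",
        "label": "Gupta Music"}
print(flag_for_review(fake, catalog))  # True -> hold for human review
```

A check like this would inevitably misfire when an artist legitimately changes labels or distributors, which is why it makes more sense as a trigger for human review than as an automatic rejection.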
Beyond Spotify: The Larger Streaming Fraud Problem
This issue of fraudulent music uploads isn’t unique to Spotify. Other platforms, like Apple Music and YouTube, also face similar challenges with streaming fraud, and the problem is only growing as more AI-generated content floods the market. AI tools allow scammers to quickly churn out vast amounts of music that mimics the styles of existing artists, making it easier for them to dupe listeners and streaming services alike.
According to Andrew Batey, CEO of Beatdapp, a company that combats streaming fraud, AI is just the latest tool in a long-standing problem. Fraudsters have previously used other methods—like digitizing old, obscure albums or slightly altering existing tracks—to steal streams. But AI has made it faster and cheaper for bad actors to create convincing fake albums in bulk. Estimates suggest that $2 to $3 billion is stolen from artists every year through such fraud.
The Way Forward: Solutions and Challenges
The solution to this growing issue is complex and may require changes both at the distributor and streaming service levels. One potential fix could involve stricter content verification processes, particularly for new artists or unfamiliar releases. Distributors, too, need to be more vigilant in vetting who they allow to upload music and under what conditions.
Ultimately, though, the issue of AI music scams highlights a larger problem within the streaming ecosystem: the lack of control artists have over their own content. While AI-generated music may be an accelerant for fraud, the underlying infrastructure that enables such scams has been around for years. Until that changes, artists will continue to face the threat of having their names attached to content that they had no hand in creating—putting their reputations, and their royalties, at risk.